Search results for: detecting architectural tactics
556 Planning Fore Stress II: Study on Resiliency of New Architectural Patterns in Urban Scale
Authors: Amir Shouri, Fereshteh Tabe
Abstract:
Master planning and thoughtful, sequential design strategies for urban infrastructure play a major role in reducing the damage that natural disasters, war, or social and population-related conflicts inflict on cities. Throughout history, defensive strategies have been revised after natural disasters, war experiences, and terrorist attacks on cities: lessons were learnt from earthquakes, from the casualties of the two World Wars in the 20th century, and from terrorist activity of all eras. Particularly after Hurricane Sandy struck New York in 2012 and the September 11th attack on New York's World Trade Centre (WTC) in the 21st century, there have been serious collaborations between law-making authorities, urban planners and architects, and defence-related organizations, firstly to prepare for and/or prevent such events and secondly to reduce human loss and economic damage to a minimum. This study develops a planning model for New York City in which citizens suffer minimum impact during the threat and the city sustains minimum economic damage after the stress has passed. The main discussion in this proposal focuses on pre-hazard, hazard-time, and post-hazard transformative policies and strategies that reduce "life casualties" and ease "economic recovery" in post-hazard conditions. The proposal scrutinizes one key solution on this path: overlaying the architectural platforms of three fundamental infrastructures, namely transportation, power-related sources, and defensive capabilities, on a dynamic-transformative framework that provides maximum safety, a high level of flexibility, and the fastest action-reaction opportunities in stressful periods. "Planning Fore Stress" is carried out in an analytical, qualitative, and quantitative framework that studies cases from all over the world.
Technology, organic design, materiality, urban forms, city politics, and sustainability are discussed in different cases at an international scale. From the modern strategies of Copenhagen for living in harmony with nature to the traditional patterns of old Indonesian urban planning, from the "Iron Dome" of Israel to the "Tunnels" in Gaza, from the "ultra-high-performance quartz-infused concrete" of Iran to the peaceful, nature-friendly strategies of Switzerland, and from "urban geopolitics" in cities, war, and terrorism to the "design of sustainable cities" worldwide, each case is studied with references and a detailed analysis in order to propose the most resourceful, practical, and realistic solutions to questions on "new city divisions", "new city planning and social activities", and "new strategic architecture for safe cities". This study is a developed version of a proposal announced as a winner at MoMA in 2013 in the call for ideas for Rockaway after Hurricane Sandy.
Keywords: urban scale, city safety, natural disaster, war and terrorism, city divisions, architecture for safe cities
Procedia PDF Downloads 484
555 Research on the Role of Platelet Derived Growth Factor Receptor Beta in Promoting Dedifferentiation and Pulmonary Metastasis of Osteosarcoma Under Hypoxic Microenvironment
Authors: Enjie Xu, Zhen Huang, Kunpeng Zhu, Jianping Hu, Xiaolong Ma, Yongjie Wang, Jiazhuang Zhu, Chunlin Zhang
Abstract:
Hypoxia and dedifferentiation of osteosarcoma (OS) cells lead to poor prognosis. We set out to identify the role of hypoxia in dedifferentiation and the associated signaling pathways. We performed a sphere formation assay and identified spheroid cells as dedifferentiated cells by detecting stem cell-like markers. An RNAi assay was used to explore the expression relationship between hypoxia inducible factor 1 subunit alpha (HIF1A) and platelet derived growth factor receptor beta (PDGFRB). We obtained PDGFRB-knockdown and PDGFRB-overexpression cells through lentiviral infection experiments, and the effects of PDGFRB on cytoskeleton rearrangement and cell adhesion were explored by immunocytochemistry. Wound-healing experiments, transwell assays, and animal trials were employed to investigate the effect of PDGFRB on OS metastasis. Dedifferentiated OS cells were found to exhibit high expression of HIF1A and PDGFRB; HIF1A promoted the expression of PDGFRB, which subsequently activated ras homolog family member A (RhoA) and increased the phosphorylation of myosin light chain (MLC). PDGFRB also enhanced the phosphorylation of focal adhesion kinase (FAK). OS cell morphology and vinculin distribution were altered by PDGFRB. PDGFRB also promoted cell dedifferentiation and had a significant impact on the metastasis of OS cells both in vitro and in vivo. Our results demonstrate that HIF1A up-regulates PDGFRB under hypoxic conditions, and PDGFRB regulates the actin cytoskeleton by activating RhoA and subsequently phosphorylating MLC, thereby promoting OS dedifferentiation and pulmonary metastasis.
Keywords: osteosarcoma, dedifferentiation, metastasis, cytoskeleton rearrangement, PDGFRB, hypoxia
Procedia PDF Downloads 47
554 Comparison between High Resolution Ultrasonography and Magnetic Resonance Imaging in Assessment of Musculoskeletal Disorders Causing Ankle Pain
Authors: Engy S. El-Kayal, Mohamed M. S. Arafa
Abstract:
There are various causes of ankle pain (AP), both traumatic and non-traumatic, and various imaging techniques are available for its assessment. MRI is considered the imaging modality of choice for ankle joint evaluation, with the advantages of high spatial resolution and multiplanar capability, and hence the ability to visualize the small, complex anatomical structures around the ankle. However, the high cost, the relatively limited availability of MRI systems, and the relatively long duration of the examination are all disadvantages of MRI. Therefore, a more rapid and less expensive examination modality with good diagnostic accuracy is needed to fill this gap. High-resolution ultrasonography (HRU) has become increasingly important in the assessment of ankle disorders, with the advantages of being fast, reliable, low cost, and readily available. Ultrasound can visualize detailed anatomical structures and assess tendinous and ligamentous integrity. The aim of this study was to compare the diagnostic accuracy of HRU with MRI in the assessment of patients with AP. We included forty patients complaining of AP. All patients underwent real-time HRU and MRI of the affected ankle, and the results of both techniques were compared with surgical and arthroscopic findings. All patients were examined according to a defined protocol covering tendon tears or tendinitis; muscle tears, masses, or fluid collections; ligament sprains or tears; inflammation or fluid effusion within the joint or bursa; bone and cartilage lesions; and erosions and osteophytes. Analysis of the results showed that the mean age of the patients was 38 years. The study comprised 24 women (60%) and 16 men (40%). The accuracy of HRU in detecting causes of AP was 85%, while the accuracy of MRI was 87.5%.
In conclusion, HRU and MRI are two complementary tools of investigation, with the former used as a primary tool and the latter used to confirm the diagnosis and the extent of the lesion, especially when surgical intervention is planned.
Keywords: ankle pain (AP), high-resolution ultrasound (HRU), magnetic resonance imaging (MRI), ultrasonography (US)
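The reported accuracies follow from comparing each modality's findings with the surgical/arthroscopic reference standard. As a sketch of how such figures are computed (with hypothetical confusion-matrix counts, not the study's data):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Compute accuracy, sensitivity, and specificity from confusion-matrix counts."""
    total = tp + fp + tn + fn
    return {
        "accuracy": (tp + tn) / total,   # overall agreement with the reference standard
        "sensitivity": tp / (tp + fn),   # true-positive rate
        "specificity": tn / (tn + fp),   # true-negative rate
    }

# Hypothetical example: 40 ankles, 34 correctly classified -> 85% accuracy
print(diagnostic_metrics(tp=28, fp=3, tn=6, fn=3))
```

Comparing the two modalities' metrics on the same reference set is what supports the conclusion that they are complementary rather than interchangeable.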
Procedia PDF Downloads 190
553 Decision Making in Medicine and Treatment Strategies
Authors: Kamran Yazdanbakhsh, Somayeh Mahmoudi
Abstract:
Three reasons argue for the use of decision theory in medicine: 1. The growth and complexity of medical knowledge make it difficult to process treatment information effectively without resorting to sophisticated analytical methods, especially when it comes to detecting errors and identifying treatment opportunities in databases of large size. 2. There is wide geographic variability in medical practice. In a context where medical costs are borne, at least in part, by the patient, this variability raises doubts about the relevance of the choices made by physicians. These differences are generally attributed to differing estimates of the probability of treatment success and differing assessments of the consequences of success or failure. Without explicit decision criteria, it is difficult to identify precisely the sources of these variations in treatment. 3. Beyond the principle of informed consent, patients need to be involved in decision-making; for this, the decision process should be explained and broken down. A decision problem consists in selecting the best option among a set of choices. The difficulty lies in what is meant by "best option", that is, in knowing what criteria guide the choice; the purpose of decision theory is to answer this question. The systematic use of decision models allows us to better understand differences in medical practice and facilitates the search for consensus. Three types of situations arise: certain situations, risky situations, and uncertain situations. 1. In certain situations, the consequences of each decision are certain. 2. In risky situations, each decision can have several consequences, and the probability of each consequence is known. 3. In uncertain situations, each decision can have several consequences, but their probabilities are not known. Our aim in this article is to show how decision theory can usefully be mobilized to meet the needs of physicians.
Decision theory can make decisions more transparent: first, by systematically clarifying the data relevant to the problem, and second, by asking which basic principles should guide the choice. Once the problem is clarified, decision theory provides operational tools to represent the available information and determine patient preferences, and thus to assist the patient and doctor in their choices.
Keywords: decision making, medicine, treatment strategies, patient
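The three situation types map onto standard decision rules; under risk, the classical criterion is maximum expected utility. A minimal sketch, with hypothetical probabilities and utilities for a treatment choice:

```python
def expected_utility(option):
    """Expected utility of an option: sum of probability * utility over its consequences."""
    return sum(p * u for p, u in option["consequences"])

def best_option(options):
    """Select the option with maximal expected utility (the decision rule under risk)."""
    return max(options, key=expected_utility)

# Hypothetical treatment choice: each consequence is a (probability, utility) pair
options = [
    {"name": "surgery",    "consequences": [(0.7, 0.9), (0.3, 0.2)]},   # EU = 0.69
    {"name": "medication", "consequences": [(0.9, 0.6), (0.1, 0.4)]},   # EU = 0.58
]
print(best_option(options)["name"])  # surgery
```

In certain situations the probabilities collapse to 1, and in uncertain situations other criteria (e.g., maximin) replace the expectation, since no probabilities are available.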
Procedia PDF Downloads 579
552 Reactive and Concurrency-Based Image Resource Management Module for iOS Applications
Authors: Shubham V. Kamdi
Abstract:
This paper serves as an introduction to image resource caching techniques for iOS mobile applications. It explains how developers can break down multiple image-downloading tasks and run them concurrently using state-of-the-art iOS frameworks, namely Swift Concurrency and Combine. The paper explains how developers can leverage SwiftUI to develop reactive view components using declarative coding patterns. Developers will learn to bypass the built-in image caching system by implementing a Swift-based LRU cache. The paper provides a full architectural overview of such a system, helping readers understand how mobile applications are designed professionally. It covers a technical discussion of the low-level details of threads, how execution can switch between them, and the significance of the main and background threads when requesting HTTP services from mobile applications.
Keywords: main thread, background thread, reactive view components, declarative coding
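Since the paper's implementation is Swift-specific, the eviction policy itself can be illustrated language-agnostically. A minimal Python sketch of an LRU image cache (names and capacity are illustrative, not the paper's API):

```python
from collections import OrderedDict

class LRUImageCache:
    """Least-recently-used cache: once capacity is exceeded, the entry that was
    touched longest ago is evicted. A sketch of the Swift-based LRU cache the
    paper describes, not its actual implementation."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._store = OrderedDict()  # key -> image bytes, kept in recency order

    def get(self, key):
        if key not in self._store:
            return None
        self._store.move_to_end(key)  # mark as most recently used
        return self._store[key]

    def put(self, key, value):
        if key in self._store:
            self._store.move_to_end(key)
        self._store[key] = value
        if len(self._store) > self.capacity:
            self._store.popitem(last=False)  # evict the least recently used entry

cache = LRUImageCache(capacity=2)
cache.put("a.png", b"...")
cache.put("b.png", b"...")
cache.get("a.png")           # touch "a.png" so it becomes most recent
cache.put("c.png", b"...")   # capacity exceeded: evicts "b.png"
print(cache.get("b.png"))    # None
```

The recency-ordered dictionary makes both lookup and eviction O(1), which is why LRU is the usual policy for image caches with bounded memory.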
Procedia PDF Downloads 25
551 Effect of Concentration Level and Moisture Content on the Detection and Quantification of Nickel in Clay Agricultural Soil in Lebanon
Authors: Layan Moussa, Darine Salam, Samir Mustapha
Abstract:
Heavy metal contamination in agricultural soils in Lebanon poses serious environmental and health problems. Intensive efforts are being made to improve existing methods for quantifying heavy metals in contaminated environments, since conventional detection techniques have been shown to be time-consuming, tedious, and costly. The application of hyperspectral remote sensing in this field is possible and promising; however, the factors affecting the efficiency of hyperspectral imaging in detecting and quantifying heavy metals in agricultural soils have not been thoroughly studied. This study assesses the use of hyperspectral imaging for the detection of Ni in agricultural clay soil collected from the Bekaa Valley, a major agricultural area in Lebanon, under different contamination levels and soil moisture contents. Soil samples were contaminated with Ni at concentrations ranging from 150 mg/kg to 4000 mg/kg. In addition, soil with background contamination was subjected to moisture levels varying from 5 to 75%. Hyperspectral imaging was used to detect and quantify Ni contamination in the soil at the different contamination levels and moisture contents. IBM SPSS statistical software was used to develop models, constructed with linear regression algorithms, that predict the concentration of Ni and the moisture content in agricultural soil. The spectral curves obtained showed an inverse correlation of reflectance with both Ni concentration and moisture content. The models developed achieved high predicted R² values of 0.763 for Ni concentration and 0.854 for moisture content, and indicated that the Ni signal was best expressed near 2200 nm and the moisture signal near 1900 nm.
The results from this study would allow us to define the potential of using the hyperspectral imaging (HSI) technique as a reliable and cost-effective alternative for heavy metal pollution detection in contaminated soils and soil moisture prediction.
Keywords: heavy metals, hyperspectral imaging, moisture content, soil contamination
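The prediction models described above are linear regressions of concentration on reflectance. A minimal sketch of such a calibration, using hypothetical reflectance/concentration pairs (not the study's data) to illustrate the reported inverse correlation:

```python
def fit_linear(xs, ys):
    """Ordinary least-squares fit y = a + b*x (closed form for a single predictor)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b  # intercept, slope

# Hypothetical calibration points: reflectance near 2200 nm vs Ni concentration (mg/kg).
# Reflectance decreases as Ni increases, mirroring the inverse correlation reported.
reflectance = [0.42, 0.38, 0.33, 0.27, 0.20]
ni_mg_kg = [150, 1000, 2000, 3000, 4000]
a, b = fit_linear(reflectance, ni_mg_kg)
print(b < 0, a + b * 0.30)  # negative slope; predicted Ni (mg/kg) at reflectance 0.30
```

In the study itself, such fits were built in SPSS over full spectral bands; this sketch shows only the single-band case for clarity.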
Procedia PDF Downloads 101
550 Interorganizational Relationships in the Brazilian Milk Production Chain
Authors: Marcelo T. Okano, Oduvaldo Vendrametto, Osmildo S. Santos, Marcelo E. Fernandes, Heide Landi
Abstract:
The literature on interorganizational relationships between companies and organizations has grown in recent years, but doubts remain about their various configurations. Interorganizational networks are important in economic life because they facilitate the complex interdependence between transactional and cooperative organizations. A gap identified in the literature is the lack of indicators to measure and identify the types of existing networks. The objective of this research is to examine the interorganizational relationships of two milk chains through indicators proposed by the theories of four authors, characterizing them as networks or not, and to identify the benefits obtained by the chain organization. To achieve this objective, a survey of milk producers was carried out in two regions of the state of São Paulo. To collect the information needed for the analysis, exploratory research of a qualitative nature was used. The research instrument consists of a script of semi-structured interviews with open questions. Some of the answers were directed by the interviewer in the form of performance notes aimed at detecting the degree of importance, according to the perceived intensity in each regard. The various visits and interviews at several dairy farms in the regions of São Paulo indicated that interorganizational relationships are few and largely limited to the sale of milk to dairies or cooperatives; these relationships amount only to trade relations between the owner and the purchaser of milk. But when the producers are organized in associations or networks, interorganizational relationships increase and bring benefits to all participants in the network.
Keywords: interorganizational networks, dairy chain, interorganizational system, São Paulo
Procedia PDF Downloads 580
549 Domain-Specific Deep Neural Network Model for Classification of Abnormalities on Chest Radiographs
Authors: Nkechinyere Joy Olawuyi, Babajide Samuel Afolabi, Bola Ibitoye
Abstract:
This study collected a preprocessed dataset of chest radiographs, formulated a deep neural network model for detecting abnormalities, evaluated the performance of the formulated model, and implemented a prototype of it, with the view of developing a deep neural network model to automatically classify abnormalities in chest radiographs. To achieve the overall purpose of this research, a large set of chest X-ray images was sourced from the CheXpert dataset, an online repository of annotated chest radiographs compiled by the Machine Learning Research Group, Stanford University. The chest radiographs were preprocessed, using standardization and normalization, into a format that can be fed into a deep neural network. The classification problem was formulated as a multi-label binary classification model that uses a convolutional neural network architecture to decide whether each abnormality is present or absent in a chest radiograph. The classification model was evaluated using specificity, sensitivity, and Area Under the Curve (AUC) score as the parameters. A prototype of the classification model was implemented using the Keras open-source deep learning framework in the Python programming language. Based on the AUC-ROC curve, the model was able to classify atelectasis, support devices, pleural effusion, pneumonia, a normal CXR (no finding), pneumothorax, and consolidation; however, lung opacity and cardiomegaly had probabilities of less than 0.5 and were thus classified as absent. Precision, recall, and F1 score values were all 0.78, implying that the numbers of false positives and false negatives are the same and revealing some measure of label imbalance in the dataset.
The study concluded that the developed model is sufficient to classify the abnormalities in chest radiographs as present or absent.
Keywords: transfer learning, convolutional neural network, radiograph, classification, multi-label
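The multi-label decision rule described above amounts to thresholding each label's predicted probability at 0.5 independently. A minimal sketch with hypothetical sigmoid outputs (not the study's results):

```python
LABELS = ["Atelectasis", "Cardiomegaly", "Consolidation", "Lung opacity",
          "No finding", "Pleural effusion", "Pneumonia", "Pneumothorax",
          "Support devices"]

def classify_multilabel(probs, threshold=0.5):
    """Multi-label decision rule: each abnormality is judged independently,
    'present' if its predicted probability exceeds the threshold, else 'absent'."""
    return {label: ("present" if p > threshold else "absent")
            for label, p in zip(LABELS, probs)}

# Hypothetical per-label sigmoid outputs for one radiograph
probs = [0.81, 0.32, 0.64, 0.41, 0.05, 0.77, 0.58, 0.66, 0.70]
decisions = classify_multilabel(probs)
print(decisions["Cardiomegaly"])  # absent
```

Unlike softmax classification, the labels here are not mutually exclusive, so a single radiograph can legitimately carry several "present" findings at once.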
Procedia PDF Downloads 127
548 Architectural Engineering and Executive Design: Modelling Procedures, Scientific Tools, Simulation Processing
Authors: Massimiliano Nastri
Abstract:
The study is part of the scientific literature on executive design in engineering and architecture, understood as an interdisciplinary field aimed at anticipating and simulating, planning and managing, and guiding and instructing construction operations on site. On this basis, the study provides an analysis of a theoretical, methodological, and guiding character aimed at constituting the disciplinary sphere of executive design, which often lacks supporting methodological and procedural guidelines in engineering and architecture. The basic methodology of the study is an investigation of the theories and references that can contribute to constituting the scenario of executive design as the practice of modelling, visualization, and simulation of the construction phases, through the projection of the pragmatic issues of the building. It does this by proposing a series of references, interrelations, and openings intended to support, for intellectual, procedural, and applicative purposes, the executive definition of the project, aimed at activating the practices of cognitive acquisition and of realization of the intervention within reality.
Keywords: modelling and simulation technology, executive design, discretization of the construction, engineering design for building
Procedia PDF Downloads 78
547 Rheological Study of Natural Sediments: Application in Filling of Estuaries
Authors: S. Serhal, Y. Melinge, D. Rangeard, F. Hage Chehadeh
Abstract:
The filling of estuaries is an international problem that can cause economic and environmental damage. This work studies the rheological structuring mechanisms of natural sedimentary liquid-solid mixtures in estuaries in order to better understand their filling. The estuary of the Rance river, located in Brittany, France, is particularly targeted by the study. The aim is to provide answers on the rheological behavior of natural sediments by detecting the structural factors influencing the rheological parameters, so that we can better understand the filling of estuarine areas and, especially, consider sustainable solutions for the 'cleansing' of these areas. The sediments were collected from the trap of Lyvet in the Rance estuary. This trap was created by the association COEUR (Comité Opérationnel des Elus et Usagers de la Rance) in 1996 in order to facilitate the cleansing of the estuary: it creates a privileged area for the deposition of sediments and consequently makes the cleansing of the estuary easier. We began our work with a preliminary study to establish the trend of the rheological behavior of the suspensions and to identify the dormant phase that precedes the onset of their biochemical reactivity. We then highlighted the visco-plastic character of the suspensions at an early age using a Kinexus rheometer with plate-plate geometry. This rheological behavior is represented by the Bingham model, whose dynamic yield stress and viscosity can be functions of the solid volume fraction, the granular extent, and the chemical reactivity. The evolution of the viscosity as a function of the solid volume fraction is modeled by the Krieger-Dougherty model. The analysis of the dynamic yield stress, on the other hand, showed a clear functional link with the solid volume fraction.
Keywords: estuaries, rheological behavior, sediments, Kinexus rheometer, Bingham model, viscosity, yield stress
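The Krieger-Dougherty relation mentioned above links suspension viscosity to the solid volume fraction. A minimal sketch with textbook parameter values (maximum packing fraction about 0.64 and intrinsic viscosity 2.5 for rigid spheres; the values fitted to the Rance sediments would differ):

```python
def krieger_dougherty(phi, eta0, phi_m=0.64, intrinsic=2.5):
    """Krieger-Dougherty suspension viscosity:
        eta = eta0 * (1 - phi/phi_m) ** (-intrinsic * phi_m)
    phi: solid volume fraction; eta0: suspending-fluid viscosity (Pa*s);
    phi_m: maximum packing fraction; intrinsic: intrinsic viscosity
    (2.5 for rigid spheres, the Einstein limit)."""
    return eta0 * (1.0 - phi / phi_m) ** (-intrinsic * phi_m)

# Viscosity rises steeply as the volume fraction approaches maximum packing
for phi in (0.1, 0.3, 0.5):
    print(phi, krieger_dougherty(phi, eta0=1.0e-3))
```

The divergence as phi approaches phi_m is what makes concentrated estuarine muds so much harder to remobilize than dilute ones.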
Procedia PDF Downloads 160
546 Efficient Video Compression Technique Using Convolutional Neural Networks and Generative Adversarial Network
Authors: P. Karthick, K. Mahesh
Abstract:
Video has become an increasingly significant component of our everyday digital communication. As content grows richer and resolutions increase, the sheer volume of video poses serious obstacles to receiving, distributing, compressing, and displaying high-quality content. In this paper, we propose an end-to-end deep video compression model that jointly upgrades all video compression components. The method involves splitting the video into frames; comparing the images using convolutional neural networks (CNN) to remove duplicates; replacing duplicate images with a single reference image by recognizing and detecting minute changes using a generative adversarial network (GAN); and recording the sequence with long short-term memory (LSTM). Instead of the complete image, only the small changes generated using the GAN are substituted, which enables frame-level compression. Pixel-wise comparison is performed using K-nearest neighbours (KNN) over each frame, clustered with K-means, and singular value decomposition (SVD) is applied to every frame of the video for all three color channels [Red, Green, Blue] to decrease the dimension of the utility matrix [R, G, B] by extracting its latent factors. The video frames are then packed with parameters with the aid of a codec and converted back to video format, and the results are compared with the original video. Repeated experiments on several videos with different sizes, durations, frames per second (FPS), and qualities demonstrate a significant resampling rate: on average, the result had approximately a 10% deviation in quality and more than a 50% deviation in size compared with the original video.
Keywords: video compression, K-means clustering, convolutional neural network, generative adversarial network, singular value decomposition, pixel visualization, stochastic gradient descent, frame per second extraction, RGB channel extraction, self-detection and deciding system
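The duplicate-removal step can be sketched independently of the neural components: each incoming frame is compared with the frames already stored, and duplicates are replaced by references. In this illustrative Python sketch, a pixel-wise equality test stands in for the CNN similarity check described in the paper:

```python
def frames_equal(a, b, tol=0):
    """Pixel-wise comparison of two frames (flat lists of pixel values).
    A stand-in for the CNN-based similarity check the paper describes."""
    return len(a) == len(b) and all(abs(x - y) <= tol for x, y in zip(a, b))

def deduplicate_frames(frames, tol=0):
    """Frame-level compression sketch: store each unique frame once and, for
    duplicates, keep only a reference to the already-stored frame."""
    stored, index = [], []
    for frame in frames:
        for i, kept in enumerate(stored):
            if frames_equal(frame, kept, tol):
                index.append(("ref", i))        # duplicate: reference only
                break
        else:
            stored.append(frame)                # new content: store the frame
            index.append(("new", len(stored) - 1))
    return stored, index

def reconstruct(stored, index):
    """Rebuild the original frame sequence from stored frames plus references."""
    return [stored[i] for _, i in index]

# Toy four-frame clip where frames 0 and 1 are identical: only 3 frames stored
frames = [[10, 10, 10], [10, 10, 10], [20, 20, 20], [30, 30, 30]]
stored, index = deduplicate_frames(frames)
print(len(stored), reconstruct(stored, index) == frames)  # 3 True
```

In the full pipeline, the GAN would additionally encode the minute differences between near-duplicate frames rather than requiring exact equality.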
Procedia PDF Downloads 187
545 Evaluation of IMERG Performance at Estimating the Rainfall Properties through Convective and Stratiform Rain Events in a Semi-Arid Region of Mexico
Authors: Eric Muñoz de la Torre, Julián González Trinidad, Efrén González Ramírez
Abstract:
Rain varies greatly in its duration, intensity, and spatial coverage, so it is important to have sub-daily rainfall data for various applications, including risk prevention. However, ground measurements are limited by the low and irregular density of rain gauges. An alternative is Satellite Precipitation Products (SPPs), such as IMERG, which use passive microwave and infrared sensors to estimate rainfall; these SPPs, however, have to be validated before their application. The aim of this study is to evaluate the performance of the IMERG (Integrated Multi-satellitE Retrievals for Global Precipitation Measurement, final run V06B) SPP in a semi-arid region of Mexico, using sub-daily data from 4 automatic rain gauges (pluviographs) for October 2019 and June to September 2021, and applying the minimum inter-event time (MIT) criterion to separate unique rain events with a dry period of 10 hrs, for the purpose of evaluating the rainfall properties (depth, duration, and intensity). Point-to-pixel analysis and continuous, categorical, and volumetric statistical metrics were used. Results show that IMERG is capable of estimating rainfall depth with a slight overestimation but is unable to identify the real duration and intensity of rain events, showing large overestimations and underestimations, respectively. The study zone presented 80 to 85% convective rain events, the rest being stratiform rain events, classified by the variation in depth magnitude between IMERG pixels and pluviographs. IMERG showed poorer performance at detecting the former but performed well at estimating stratiform rain events, which originate from cold fronts.
Keywords: IMERG, rainfall, rain gauge, remote sensing, statistical evaluation
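The MIT criterion used above can be sketched directly: timestamps separated by a dry gap of at least 10 h belong to different events, and each event's depth, duration, and intensity then follow from its member records. An illustrative implementation (hypothetical timestamps, not the study's data):

```python
def split_rain_events(tip_times_hr, mit_hr=10.0):
    """Split a series of rainfall timestamps (hours since start) into unique
    events using the minimum inter-event time (MIT) criterion: a dry period
    of at least `mit_hr` hours starts a new event."""
    events = []
    for t in sorted(tip_times_hr):
        if events and t - events[-1][-1] < mit_hr:
            events[-1].append(t)   # dry gap shorter than MIT: same event
        else:
            events.append([t])     # dry gap >= MIT (or first record): new event
    return events

# Gaps of 0.5 h stay within one event; a 12 h gap starts a new one
times = [0.0, 0.5, 1.0, 13.0, 13.5, 30.0]
events = split_rain_events(times)
print([(e[0], e[-1]) for e in events])  # [(0.0, 1.0), (13.0, 13.5), (30.0, 30.0)]
```

Once events are delimited this way in both the gauge and IMERG series, their depths, durations, and mean intensities can be compared event by event.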
Procedia PDF Downloads 69
544 Non-Native and Invasive Fish Species in Poland
Authors: Tomasz Raczyński
Abstract:
Non-native and invasive species negatively transform ecosystems. Non-native fish species can displace native fish species through competition, predation, disrupting spawning, transforming ecosystems, or transmitting parasites, and this influence is increasingly noticeable in Poland and in the world. From December 2014 to October 2020, fish were caught by electrofishing at 416 sites in various parts of Poland. Research was conducted in both running and stagnant freshwaters, with a predominance of running waters, and only sites where the presence of fish was found were analysed. The research covered a wide spectrum of waters, from small mountain streams through drainage ditches to the largest Polish river, the Vistula; single sites covered oxbow lakes, small ponds, and lakes. The electrofishing was associated with ichthyofauna inventories and was mainly aimed at detecting protected species of fish and lampreys, or species included in the annexes to the EU Habitats Directive (Council Directive 92/43/EEC on the conservation of natural habitats and of wild fauna and flora). The results of these catches were analysed for alien and invasive fish species. The analysis of the catch structure shows that at 71 of the 416 research sites, alien and invasive fish species belonging to 9 taxa were found; alien fish species are thus present at 17% of the study sites. The most frequently observed species was the Prussian carp Carassius gibelio, recorded at 43 sites. Stone moroko Pseudorasbora parva was found at 24 sites, Chinese sleeper Perccottus glenii at 6 sites, and bullhead Ameiurus sp. also at 6 sites. Western tubenose goby Proterorhinus semilunaris was found at 5 sites and rainbow trout Oncorhynchus mykiss at 3 sites.
Monkey goby Neogobius fluviatilis, round goby Neogobius melanostomus, and Eurasian carp Cyprinus carpio were recorded at 2 sites each.
Keywords: non-native species, invasive species, fish species, invasive fish species, native fish species
Procedia PDF Downloads 110
543 Multiclass Support Vector Machines with Simultaneous Multi-Factors Optimization for Corporate Credit Ratings
Authors: Hyunchul Ahn, William X. S. Wong
Abstract:
Corporate credit rating prediction is one of the most important topics studied by researchers in the last decade. Over that decade, researchers have pushed the limits of the accuracy of corporate credit rating prediction models by applying several data-driven tools, including statistical and artificial intelligence methods. Among them, the multiclass support vector machine (MSVM) has been widely applied due to its good predictability. However, the reliance on heuristics to set, for example, the parameters of the kernel function and the appropriate feature and instance subsets has become the main criticism of MSVM, as these choices dictate the MSVM architectural variables. This study presents a hybrid MSVM model intended to optimize all of these design factors: feature selection, instance selection, and kernel parameters. Our model adopts a genetic algorithm (GA) to simultaneously optimize these multiple heterogeneous design factors of the MSVM.
Keywords: corporate credit rating prediction, feature selection, genetic algorithms, instance selection, multiclass support vector machines
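The GA wrapper described above evaluates candidate design-factor settings by their predictive performance. A minimal real-coded GA sketch over a toy fitness surface (in the actual model the fitness would be the MSVM's validation accuracy over feature subsets, instance subsets, and kernel parameters; all names and parameter values here are illustrative):

```python
import random

def genetic_search(fitness, bounds, pop_size=20, generations=40, seed=42):
    """Minimal GA sketch: real-coded individuals, truncation selection,
    uniform crossover, and Gaussian mutation clipped to the search bounds."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        elite = sorted(pop, key=fitness, reverse=True)[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)
            child = [x if rng.random() < 0.5 else y for x, y in zip(a, b)]  # crossover
            i = rng.randrange(dim)                                          # mutation
            lo, hi = bounds[i]
            child[i] = min(hi, max(lo, child[i] + rng.gauss(0, 0.1 * (hi - lo))))
            children.append(child)
        pop = elite + children  # elitism: the best individuals always survive
    return max(pop, key=fitness)

# Toy surrogate for validation accuracy, peaking at C=1.0, gamma=0.5
def toy_fitness(params):
    c, gamma = params
    return -((c - 1.0) ** 2 + (gamma - 0.5) ** 2)

best = genetic_search(toy_fitness, bounds=[(0.01, 10.0), (0.01, 2.0)])
print(best)  # best parameters found (expected to approach C=1.0, gamma=0.5)
```

The same loop applies unchanged when each individual also encodes binary masks for feature and instance selection; only the encoding and the fitness evaluation change.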
Procedia PDF Downloads 294
542 Enhancement of Visual Comfort Using Parametric Double Skin Façade
Authors: Ahmed A. Khamis, Sherif A. Ibrahim, Mahmoud El Khatieb, Mohamed A. Barakat
Abstract:
Parametric design is an icon of modern architecture that facilitates complex design decisions by altering various design parameters, and double skin facades are one of its applications. This paper aims to enhance the daylight performance of a selected case study office building in Cairo using a parametric double skin facade. First, the design and optimization process was executed using the Grasshopper parametric design software, a plug-in for Rhino. The daylighting performance of the base case building model was compared with that of the model using the double facade, showing an enhancement in daylighting performance indicators such as glare and task illuminance in the modified model. Execution drawings were then made for the optimized design through Revit, followed by computerized digital fabrication of the designed model at various scales, using Simplify 3D for the mock-up digital fabrication, to reach the final design decisions.
Keywords: parametric design, double skin facades, digital fabrication, grasshopper, simplify 3D
Procedia PDF Downloads 118
541 The Guide Presentation: The Grand Palace
Authors: Nuchanat Handumrongkul, Danaya Darnsawasdi, Anantachai Aeka
Abstract:
This research was conducted to provide a model for oral presentations by tour guides. Its purpose is to analyze the content used by tour guides, in order to develop the teaching and study of the French language for tourism. The study employed audio recordings of these presentations, collected as interviews in authentic situations, with four guides as respondents and information providers. The data was analyzed through content analysis. The results found that the tour guides described eight important items, giving more importance to details at Wat Phra Kaew, the Temple of the Emerald Buddha, than at the palaces: the buildings upon the upper terrace, Buddhist cosmology, the decoration techniques, the royal chapel, the mural paintings, Thai offerings to Buddha images, palaces with architectural features and functions including royal ceremonies, and others. This information represents the Thai characteristics of each building and other related content. The findings were used as a manual on how guides should describe a tourist attraction, especially the temple, and other related cultural topics of interest.
Keywords: guide, guide presentation, Grand Palace, Buddhist cosmology
Procedia PDF Downloads 500540 Hyperspectral Imaging and Nonlinear Fukunaga-Koontz Transform Based Food Inspection
Authors: Hamidullah Binol, Abdullah Bal
Abstract:
Nowadays, food safety is a great public concern; therefore, robust and effective techniques are required for assessing the safety status of goods. Hyperspectral Imaging (HSI) is an attractive technique for researchers inspecting food quality and safety, with applications such as meat quality assessment, automated poultry carcass inspection, quality evaluation of fish, bruise detection in apples, quality analysis and grading of citrus fruits, bruise detection in strawberries, visualization of sugar distribution in melons, measuring the ripening of tomatoes, defect detection in pickling cucumbers, and classification of wheat kernels. HSI can concurrently collect large amounts of spatial and spectral data on the objects being observed. The technique yields exceptional detection capability, which otherwise cannot be achieved with either imaging or spectroscopy alone. This paper presents a nonlinear technique based on the kernel Fukunaga-Koontz transform (KFKT) for detection of fat content in ground meat using HSI. The KFKT, the nonlinear version of the FKT, is one of the most effective techniques for solving problems of a two-pattern nature. The conventional FKT method has been improved with kernel machines to increase nonlinear discrimination ability and capture higher-order statistics of the data. The approach proposed in this paper aims to segment the fat content of ground meat by regarding the fat as the target class, to be separated from the remaining classes (the clutter). We have applied the KFKT to visible and near-infrared (VNIR) hyperspectral images of ground meat to determine fat percentage. The experimental studies indicate that the proposed technique produces high detection performance for fat ratio in ground meat.Keywords: food (ground meat) inspection, Fukunaga-Koontz transform, hyperspectral imaging, kernel methods
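The two-pattern idea behind the linear Fukunaga-Koontz transform can be sketched in a few lines of NumPy: after jointly whitening the summed class covariances, the eigenvectors that best represent the target class are exactly the ones that worst represent the clutter. This is a minimal linear sketch on synthetic data, not the kernelized (KFKT) version used in the paper.

```python
import numpy as np

def fkt(X1, X2):
    """Linear Fukunaga-Koontz transform for a two-class (target vs clutter)
    problem. X1, X2: (n_samples, n_features) arrays of the two classes."""
    C1 = np.cov(X1, rowvar=False)
    C2 = np.cov(X2, rowvar=False)
    # Jointly whiten the summed covariance: W.T @ (C1 + C2) @ W = I
    vals, vecs = np.linalg.eigh(C1 + C2)
    W = vecs @ np.diag(vals ** -0.5)
    # In the whitened space S1 + S2 = I, so eigenvectors of S1 with the
    # largest eigenvalues represent class 1 best and class 2 worst.
    S1 = W.T @ C1 @ W
    d, U = np.linalg.eigh(S1)       # eigenvalues ascending, all in (0, 1)
    return W @ U, d

rng = np.random.default_rng(0)
X1 = rng.normal(size=(500, 3)) * [3.0, 1.0, 1.0]   # target varies along axis 0
X2 = rng.normal(size=(500, 3)) * [1.0, 1.0, 3.0]   # clutter varies along axis 2
basis, eigvals = fkt(X1, X2)
print(eigvals)   # smallest ~ clutter-dominated axis, largest ~ target-dominated
```

Projecting a pixel onto the leading and trailing columns of the basis then gives complementary "target-like" and "clutter-like" scores.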
Procedia PDF Downloads 431539 Compatibility of Disabilities for a Single Workplace through Mobile Technology: A Case Study in Brazilian Industries
Authors: Felyppe Blum Goncalves, Juliana Sebastiany
Abstract:
In line with Brazilian legislation on the inclusion of persons with disabilities in the world of work, known as the 'quota law' (Law 8213/91), and in accordance with the United Nations Convention on the Rights of Persons with Disabilities, ratified by Brazil through Federal Decree No. 6.949 of August 25, 2009, the SESI National Department, through working groups, structured the product Affordable Industry. This methodology aims to prepare industries for an adequate process of inclusion of people with disabilities, as well as for the development of an organizational culture that values and respects human diversity. All industries in Brazil with 100 or more employees must comply with the current legislation, but due to a lack of information and guidance on the subject, they often have difficulties in this process. The methodology brings solutions for companies through the professional qualification of people with disabilities, the preparation of managers, and the training of human resources teams and employees. It also advocates surveying the architectural accessibility of the factory and identifying possibilities for the inclusion of people with disabilities, through compatibility between workers and job requirements, preserving safety, health, and quality of life.Keywords: inclusion, app, disability, management
Procedia PDF Downloads 163538 Analysis of an IncResU-Net Model for R-Peak Detection in ECG Signals
Authors: Beatriz Lafuente Alcázar, Yash Wani, Amit J. Nimunkar
Abstract:
Cardiovascular Diseases (CVDs) are the leading cause of death globally, and around 80% of sudden cardiac deaths are due to arrhythmias, or irregular heartbeats. The majority of these pathologies are revealed by either short-term or long-term alterations in electrocardiogram (ECG) morphology. The ECG is the main diagnostic tool in cardiology. It is a non-invasive, pain-free procedure that measures the heart's electrical activity and allows the detection of abnormal rhythms and underlying conditions. A cardiologist can diagnose a wide range of pathologies based on alterations in the ECG's form, but human interpretation is subjective and prone to error. Moreover, ECG records can be quite prolonged, which can further complicate visual diagnosis and delay disease detection. In this context, deep learning methods have risen as a promising strategy to extract relevant features and eliminate individual subjectivity in ECG analysis. They facilitate the computation of large sets of data and can provide early and precise diagnoses. Therefore, cardiology is one of the fields that can most benefit from the implementation of deep learning algorithms. In the present study, a deep learning algorithm is trained following a novel approach, using a combination of different databases as the training set. The goal of the algorithm is to detect R-peaks in ECG signals. Its performance is further evaluated on ECG signals with different origins and features to test the model's ability to generalize. Performance of the model for detection of R-peaks in clean and noisy ECGs is presented. The model is able to detect R-peaks in the presence of various types of noise, and when presented with data it has not been trained on.
It is expected that this approach will increase the effectiveness and capacity of cardiologists to detect deviations from the normal cardiac activity of their patients.Keywords: arrhythmia, deep learning, electrocardiogram, machine learning, R-peaks
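For contrast with the deep learning model described above, a classical amplitude-threshold baseline for R-peak detection fits in a few lines. The synthetic signal, the threshold rule and the refractory window below are all illustrative choices, not those of the study.

```python
import numpy as np

def detect_r_peaks(ecg, fs, refractory=0.2):
    """Minimal threshold-based R-peak detector (illustrative classical
    baseline, not the paper's model). ecg: 1-D signal, fs: sampling rate (Hz)."""
    energy = np.gradient(ecg) ** 2            # emphasize the sharp QRS slope
    thresh = energy.mean() + 2 * energy.std()
    candidates = np.where(energy > thresh)[0]
    peaks, last = [], -int(refractory * fs)
    for i in candidates:
        if i - last >= refractory * fs:       # enforce a refractory period
            # snap to the local maximum of the raw signal near the candidate
            lo, hi = max(0, i - int(0.05 * fs)), i + int(0.05 * fs)
            peaks.append(lo + int(np.argmax(ecg[lo:hi])))
            last = peaks[-1]
    return np.array(peaks)

fs = 250
t = np.arange(0, 10, 1 / fs)
ecg = 0.1 * np.sin(2 * np.pi * 1.0 * t)             # baseline wander
true_peaks = np.arange(int(0.5 * fs), len(t), fs)   # one beat per second
for p in true_peaks:
    ecg[p] = 1.0                                     # idealized R spikes
detected = detect_r_peaks(ecg, fs)
print(len(true_peaks), len(detected))
```

Such hand-tuned detectors degrade quickly under realistic noise, which is precisely the gap the learned model targets.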
Procedia PDF Downloads 186537 Study on the Demolition Waste Management in Malaysia Construction Industry
Authors: Gunalan Vasudevan
Abstract:
The Malaysian construction industry generates a large quantity of construction and demolition waste nowadays. The handbook for demolition work covers only a small portion of demolition waste management. It is therefore important to determine ways to provide a practical guide for professionals in the building industry on handling demolition waste. In general, demolition is defined as the tearing down or wrecking of the structural or architectural work of buildings and of other infrastructure such as roads and bridges. It is a common misconception that demolition is nothing more than taking down a structure and carrying the debris to a landfill. On many projects, 80-90% of the structure is kept for reuse or recycling, which helps the owner save cost. Demolition contractors require a great deal of knowledge and experience to minimize the impact of demolition work on the existing surrounding area. For data collection, postal questionnaires and interviews were selected. Questionnaires were distributed to 80 respondents from the construction industry in the Klang Valley. 67 of the 80 respondents replied to the questionnaire, while 4 people were interviewed. Microsoft Excel and the Statistical Package for Social Science version 17.0 were used to analyze the collected data.Keywords: demolition, waste management, construction material, Malaysia
Procedia PDF Downloads 443536 Enhancement of Road Defect Detection Using First-Level Algorithm Based on Channel Shuffling and Multi-Scale Feature Fusion
Authors: Yifan Hou, Haibo Liu, Le Jiang, Wandong Su, Binqing Wang
Abstract:
Road defect detection is crucial for modern urban management and infrastructure maintenance. Traditional road defect detection methods mostly rely on manual labor, which is not only inefficient but also difficult to keep reliable. Existing deep learning-based road defect detection models, meanwhile, have poor detection performance in complex environments and lack robustness to multi-scale targets. To address this challenge, this paper proposes a distinct detection framework based on a one-stage network structure. The paper designs a deep feature extraction network based on RCSDarknet, which applies channel shuffling to enhance information fusion between tensors. Through repeated stacking of RCS modules, the information flow between different channels of adjacent layer features is enhanced, improving the model's ability to capture target spatial features. In addition, a multi-scale feature fusion mechanism with weighted dual flow paths is adopted to fuse spatial features of different scales, further improving the detection performance of the model at different scales. To validate the performance of the proposed algorithm, we tested it on the RDD2022 dataset. The experimental results show that the enhanced algorithm achieved 84.14% mAP, which is 1.06% higher than the advanced YOLOv8 algorithm. Visualization of the results also shows that the proposed algorithm performs well in detecting targets of different scales in complex scenes. These experimental results demonstrate the effectiveness and superiority of the proposed algorithm, providing valuable insights for advancing real-time road defect detection methods.Keywords: roads, defect detection, visualization, deep learning
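Channel shuffling, in its standard ShuffleNet-style form, is a reshape-transpose-reshape that interleaves the channels of grouped features so information can cross group boundaries. A minimal NumPy sketch of the generic operation (not the paper's exact RCS module) is:

```python
import numpy as np

def channel_shuffle(x, groups):
    """ShuffleNet-style channel shuffle: interleave the channels of grouped
    features so information flows between groups.
    x: feature tensor of shape (N, C, H, W); C must be divisible by groups."""
    n, c, h, w = x.shape
    assert c % groups == 0
    # (N, C, H, W) -> (N, g, C//g, H, W) -> swap the two group axes -> flatten
    return (x.reshape(n, groups, c // groups, h, w)
             .transpose(0, 2, 1, 3, 4)
             .reshape(n, c, h, w))

x = np.arange(8, dtype=np.float32).reshape(1, 8, 1, 1)  # channels 0..7
y = channel_shuffle(x, groups=2)
print(y.ravel())  # channels interleaved: [0, 4, 1, 5, 2, 6, 3, 7]
```

Because the operation is a pure permutation, it adds cross-group mixing at zero parameter cost.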
Procedia PDF Downloads 6535 Information Visualization Methods Applied to Nanostructured Biosensors
Authors: Osvaldo N. Oliveira Jr.
Abstract:
The control of molecular architecture inherent in some experimental methods to produce nanostructured films has had great impact on devices of various types, including sensors and biosensors. The self-assembled monolayer (SAM) and electrostatic layer-by-layer (LbL) techniques, for example, are now routinely used to produce tailored architectures for biosensing where biomolecules are immobilized with long-lasting preserved activity. Enzymes, antigens, antibodies, peptides and many other molecules serve as the molecular recognition elements for detecting an equally wide variety of analytes. The principles of detection are also varied, including electrochemical methods, fluorescence spectroscopy and impedance spectroscopy. In this presentation an overview will be provided of biosensors made with nanostructured films to detect antibodies associated with tropical diseases and HIV, in addition to the detection of analytes of medical interest such as cholesterol and triglycerides. Because large amounts of data are generated in biosensing experiments, computational and statistical methods have been used to optimize performance. Multidimensional projection techniques such as Sammon's mapping have been shown to be more efficient than traditional multivariate statistical analysis in identifying small concentrations of anti-HIV antibodies and in distinguishing between blood serum samples of animals infected with two tropical diseases, namely Chagas disease and Leishmaniasis. Optimization of biosensing may include combining another information visualization method, the parallel coordinates technique, with artificial intelligence methods in order to identify the most suitable frequencies for reaching higher sensitivity using impedance spectroscopy.
Also discussed will be the possible convergence of technologies, through which machine learning and other computational methods may be used to treat data from biosensors within an expert system for clinical diagnosis.Keywords: clinical diagnosis, information visualization, nanostructured films, layer-by-layer technique
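Sammon's mapping judges a low-dimensional layout by a distance-preservation stress, which is what makes it sensitive to small, structured differences between sample groups. A minimal sketch of that stress, with synthetic points standing in for the biosensing measurements, is:

```python
import numpy as np

def sammon_stress(X, Y):
    """Sammon's stress: how faithfully a low-dimensional projection Y preserves
    the pairwise distances of the original high-dimensional data X."""
    D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)  # original distances
    d = np.linalg.norm(Y[:, None] - Y[None, :], axis=-1)  # projected distances
    i, j = np.triu_indices(len(X), k=1)                   # unique pairs only
    return ((D[i, j] - d[i, j]) ** 2 / D[i, j]).sum() / D[i, j].sum()

rng = np.random.default_rng(1)
Z = rng.normal(size=(50, 2))                          # latent 2-D structure
X = np.hstack([Z, 0.05 * rng.normal(size=(50, 2))])   # ~2-D data in 4-D space
good = Z                                              # faithful projection
bad = rng.normal(size=(50, 2))                        # unrelated placement
print(sammon_stress(X, good) < sammon_stress(X, bad))
```

Sammon's method then iteratively moves the 2-D points to minimize this stress; the snippet only evaluates it, which is enough to compare candidate layouts.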
Procedia PDF Downloads 337534 Interaction of Histone H1 with Chromatin-associated Protein HMGB1 Studied by Microscale Thermophoresis
Authors: Michal Štros, Eva Polanská, Šárka Pospíšilová
Abstract:
HMGB1 is an architectural protein in chromatin, acting also as a signaling molecule outside the cell. Recent reports from several laboratories provided evidence that a number of both the intracellular and extracellular functions of HMGB1 may depend on redox-sensitive cysteine residues of the protein. MALDI-TOF analysis revealed that mild oxidation of HMGB1 resulted in a conformational change of the protein due to the formation of an intramolecular disulphide bond between the opposing Cys23 and Cys45 residues. We have demonstrated that the redox state of HMGB1 can significantly modulate the ability of the protein to bind and bend DNA. We have also shown that reduced HMGB1 can easily displace histone H1 from DNA, while oxidized HMGB1 has limited capacity for H1 displacement. Using microscale thermophoresis (MST), we have further studied the mechanism of HMGB1 interaction with histone H1 in free solution or when histone H1 is bound to DNA. Our MST analysis indicated that in free solution reduced HMGB1 exhibited a > 1,000-fold higher affinity for H1 (KD ~ 4.5 nM) than oxidized HMGB1 (KD ~ 10 µM). Finally, we present a novel mechanism for the HMGB1-mediated modulation of histone H1 binding to DNA.Keywords: HMGB1, histone H1, redox state, interaction, cross-linking, DNA bending, DNA end-joining, microscale thermophoresis
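The KD values read off an MST titration map onto the usual 1:1 binding isotherm, fraction bound = [L] / (KD + [L]). The snippet below only illustrates how a roughly 1,000-fold change in KD shifts that curve; the micromolar KD for the oxidized form is an illustrative value, not a measured one.

```python
import numpy as np

def fraction_bound(conc, kd):
    """Simple 1:1 binding isotherm used to read K_D off an MST titration:
    fraction bound = [L] / (K_D + [L]), concentrations in molar."""
    return conc / (kd + conc)

conc = np.logspace(-10, -4, 7)               # 0.1 nM .. 100 uM titration
fb_reduced = fraction_bound(conc, 4.5e-9)    # K_D ~ 4.5 nM (reduced HMGB1)
fb_oxidized = fraction_bound(conc, 4.5e-6)   # illustrative micromolar K_D
# At [L] = K_D the site is half-occupied; the tighter binder saturates
# roughly three orders of magnitude earlier in the titration.
print(fraction_bound(4.5e-9, 4.5e-9))  # 0.5
```

Plotting both curves against log concentration reproduces the familiar sigmoid shift between the two redox states.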
Procedia PDF Downloads 334533 Impacts of Building Design Factors on Auckland School Energy Consumptions
Authors: Bin Su
Abstract:
This study focuses on the impact of school building design factors on winter extra energy consumption, which mainly comprises space heating, water heating and other appliances related to winter indoor thermal conditions. A number of Auckland schools were randomly selected for the study, which introduces a method of using real monthly energy consumption data for a year to calculate winter extra energy data for school buildings. The study seeks to identify the relationships between winter extra energy data and school building design data concerning the main architectural features, building envelope and elements of the sample schools. The relationships can be used to estimate the approximate saving in winter extra energy consumption that would result from a changed design datum for future school development, and to identify any major energy-efficiency design problems. The relationships are also valuable for developing passive design guides for school energy efficiency.Keywords: building energy efficiency, building thermal design, building thermal performance, school building design
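One way to operationalize "winter extra energy" from a year of monthly bills is to subtract a non-heating-season baseline from each winter month and sum the excess. The monthly figures and the May-September winter window below are invented for illustration, not taken from the study.

```python
import numpy as np

# Illustrative monthly electricity use (kWh) for one school, Jan..Dec.
# Auckland winters (Southern Hemisphere) fall roughly in May-September.
monthly_kwh = np.array([3200, 3100, 3300, 3600, 4800, 5600,
                        5900, 5700, 4900, 3700, 3300, 3250], dtype=float)

winter = slice(4, 9)                         # May (index 4) .. September (index 8)
non_winter = np.r_[monthly_kwh[:4], monthly_kwh[9:]]
baseline = non_winter.mean()                 # non-heating-season monthly baseline
winter_extra = (monthly_kwh[winter] - baseline).sum()
print(baseline, winter_extra)                # kWh above baseline over winter
```

Regressing this per-school quantity against envelope and architectural variables is then what yields the design relationships the abstract describes.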
Procedia PDF Downloads 442532 Associations of the FTO Gene Polymorphism with Obesity and Metabolic Syndrome in Lithuanian Adult Population
Authors: Alina Smalinskiene, Janina Petkeviciene, Jurate Klumbiene, Vilma Kriaucioniene, Vaiva Lesauskaite
Abstract:
The worldwide prevalence of obesity has been increasing dramatically in the last few decades, and Lithuania is no exception. In 2012, every fifth adult (19% of men and 20.5% of women) was obese and every third was overweight. Association studies have highlighted the influence of SNPs in obesity, with particular focus on FTO rs9939609. Thus far, no data on the possible association of this SNP with obesity in the adult Lithuanian population have been reported. Here, for the first time, we demonstrate an association between the FTO rs9939609 homozygous AA genotype and increased BMI when compared to the homozygous TT genotype. Furthermore, a positive association was determined between the FTO rs9939609 variant and the risk of metabolic syndrome. Background: This study aimed to examine the associations of the fat mass and obesity-associated (FTO) gene rs9939609 variant with obesity and metabolic syndrome in the Lithuanian adult population. Materials and Methods: A cross-sectional health survey was carried out in randomly selected municipalities of Lithuania. The random sample was obtained from lists of 25–64 year-old inhabitants. The data from 1020 subjects were analysed. The rs9939609 SNP of the FTO gene was assessed using TaqMan assays (Applied Biosystems, Foster City, CA, USA). The Applied Biosystems 7900HT Real-Time Polymerase Chain Reaction System was used for detecting the SNPs. Results: Carriers of the AA genotype had the highest mean values of BMI and waist circumference (WC) and the highest risk of obesity. Interactions 'genotype x age' and 'genotype x physical activity' in determining BMI and WC were shown. Neither lipid and glucose levels nor blood pressure were associated with rs9939609 independently of BMI. In the age group of 25-44 years, an association between the FTO genotypes and metabolic syndrome was found. Conclusion: The FTO rs9939609 variant was significantly associated with BMI and WC, and with the risk of obesity in the Lithuanian population.
The FTO polymorphism might have a greater influence on weight status in younger individuals and in subjects with a low level of physical activity.Keywords: obesity, metabolic syndrome, FTO gene, polymorphism, Lithuania
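Genotype-obesity associations of this kind are typically summarized as an odds ratio from a 2x2 genotype-by-outcome table. The counts below are hypothetical, chosen only to show the arithmetic; they are not the study's data.

```python
import numpy as np

# Hypothetical 2x2 table (illustrative counts, not the study's data):
# rows: AA carriers, TT carriers; columns: obese, non-obese
table = np.array([[40, 60],
                  [25, 95]], dtype=float)

def odds_ratio(t):
    """Odds ratio and its 95% CI (Woolf's log method) for a 2x2 table."""
    or_ = (t[0, 0] * t[1, 1]) / (t[0, 1] * t[1, 0])
    se = np.sqrt((1.0 / t).sum())          # standard error of log(OR)
    lo, hi = np.exp(np.log(or_) + np.array([-1.96, 1.96]) * se)
    return or_, (lo, hi)

or_, ci = odds_ratio(table)
print(round(or_, 2))  # 2.53
```

A confidence interval excluding 1 corresponds to the "significantly associated" claim; adjusted analyses would use logistic regression instead of the raw table.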
Procedia PDF Downloads 430531 Using Hyperspectral Sensor and Machine Learning to Predict Water Potentials of Wild Blueberries during Drought Treatment
Authors: Yongjiang Zhang, Kallol Barai, Umesh R. Hodeghatta, Trang Tran, Vikas Dhiman
Abstract:
Detecting water stress on crops early and accurately is crucial to minimizing its impact. This study aims to measure water stress in wild blueberry crops non-destructively by analyzing proximal hyperspectral data. The data collection took place in the summer growing season of 2022. A drought experiment was conducted on wild blueberries in a randomized block design in the greenhouse, incorporating various genotypes and irrigation treatments. Hyperspectral data (spectral range: 400-1000 nm) using a handheld spectroradiometer and leaf water potential data using a pressure chamber were collected from wild blueberry plants. Machine learning techniques, including multiple regression analysis and random forest models, were employed to predict leaf water potential (MPa). We explored the optimal wavelength bands for simple differences (RY1 - RY2), simple ratios (RY1/RY2), and normalized differences ((RY1 - RY2)/(RY1 + RY2)). NDWI ((R857 - R1241)/(R857 + R1241)), SD (R2188 - R2245), and SR (R1752/R1756) emerged as the top predictors of leaf water potential, contributing significantly to the highest model performance. The base learner models achieved an R-squared value of approximately 0.81, indicating their capacity to explain 81% of the variance. Research is underway to develop a neural vegetation index (NVI) that automates the process of index development by searching for specific wavelengths in the space of ratios of linear functions of reflectance. The NVI framework could work across species and predict different physiological parameters.Keywords: hyperspectral reflectance, water potential, spectral indices, machine learning, wild blueberries, optimal bands
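The three index families screened in the study are straightforward to compute from a reflectance spectrum. The snippet below uses a synthetic smooth spectrum in place of real blueberry data; the band positions follow the abstract, while the nearest-band lookup is an implementation convenience, not part of the paper.

```python
import numpy as np

def band(wavelengths, reflectance, nm):
    """Reflectance at the sampled band nearest the requested wavelength (nm)."""
    return reflectance[..., np.argmin(np.abs(wavelengths - nm))]

def water_indices(wavelengths, reflectance):
    """Normalized difference, simple difference and simple ratio indices
    using the band pairs named in the abstract."""
    r857 = band(wavelengths, reflectance, 857)
    r1241 = band(wavelengths, reflectance, 1241)
    ndwi = (r857 - r1241) / (r857 + r1241)                               # normalized difference
    sd = band(wavelengths, reflectance, 2188) - band(wavelengths, reflectance, 2245)  # simple difference
    sr = band(wavelengths, reflectance, 1752) / band(wavelengths, reflectance, 1756)  # simple ratio
    return ndwi, sd, sr

wl = np.arange(400, 2501, 1.0)            # 1-nm grid, 400-2500 nm
refl = 0.3 + 0.1 * np.sin(wl / 300.0)     # synthetic smooth spectrum
ndwi, sd, sr = water_indices(wl, refl)
print(ndwi, sd, sr)
```

These per-sample index values are then the features fed to the regression and random forest models to predict leaf water potential.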
Procedia PDF Downloads 67530 Analytical Model of Multiphase Machines Under Electrical Faults: Application on Dual Stator Asynchronous Machine
Authors: Nacera Yassa, Abdelmalek Saidoune, Ghania Ouadfel, Hamza Houassine
Abstract:
The rapid advancement in electrical technologies has underscored the increasing importance of multiphase machines across various industrial sectors. These machines offer significant advantages in terms of efficiency, compactness, and reliability compared to their single-phase counterparts. However, early detection and diagnosis of electrical faults remain critical challenges to ensure the durability and safety of these complex systems. This paper presents an advanced analytical model for multiphase machines, with a particular focus on dual stator asynchronous machines. The primary objective is to develop a robust diagnostic tool capable of effectively detecting and locating electrical faults in these machines, including short circuits, winding faults, and voltage imbalances. The proposed methodology relies on an analytical approach combining electrical machine theory, modeling of magnetic and electrical circuits, and advanced signal analysis techniques. By employing detailed analytical equations, the developed model accurately simulates the behavior of multiphase machines in the presence of electrical faults. The effectiveness of the proposed model is demonstrated through a series of case studies and numerical simulations. In particular, special attention is given to analyzing the dynamic behavior of machines under different types of faults, as well as optimizing diagnostic and recovery strategies. The obtained results pave the way for new advancements in the field of multiphase machine diagnostics, with potential applications in various sectors such as automotive, aerospace, and renewable energies. By providing precise and reliable tools for early fault detection, this research contributes to improving the reliability and durability of complex electrical systems while reducing maintenance and operation costs.Keywords: faults, diagnosis, modelling, multiphase machine
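One classical signature behind detecting the voltage imbalances mentioned above is a non-zero negative-sequence component of the three-phase set (Fortescue's symmetrical components). A minimal sketch with an illustrative phase-b sag, independent of the paper's dual stator model:

```python
import numpy as np

a = np.exp(2j * np.pi / 3)                  # 120-degree rotation operator
F = np.array([[1, 1,      1],
              [1, a,      a**2],
              [1, a**2,   a]]) / 3.0        # rows: zero, positive, negative seq.

def sequence_components(phasors):
    """Symmetrical components of a three-phase phasor set; a rising
    negative-sequence magnitude is a classic imbalance-fault signature."""
    return F @ phasors

balanced = np.array([1.0, a**2, a])          # healthy a-b-c set
faulted = np.array([1.0, 0.8 * a**2, a])     # illustrative 20% phase-b sag
print(np.abs(sequence_components(balanced)).round(3))  # [0. 1. 0.]
print(np.abs(sequence_components(faulted)).round(3))
```

A healthy supply yields only a positive-sequence component; the sag leaks equal parts into the zero and negative sequences, which a monitor can threshold.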
Procedia PDF Downloads 63529 Implementation of Student-Centered Learning Approach in Building Surveying Course
Authors: Amal A. Abdel-Sattar
Abstract:
The curriculum of the architecture department at Prince Sultan University includes a 'Building Surveying' course, which is usually part of the civil engineering curriculum. As a fundamental requirement, the course demands a strong background in mathematics and physics, which are not usually favourite subjects of architecture students; many of them do not give these courses the necessary attention during the preparation year before commencing their architectural study. This paper introduces the concept and methodology of the student-centered learning approach in the building surveying course for architects. One of the major outcomes is the improvement in the students' involvement in the course, and how this covers and strengthens their analytical weak points and improves their mathematical skills. The study was conducted over three semesters with a total of 99 students. The effectiveness of the student-centered learning approach was studied using a student survey at the end of each semester together with teacher observations. The survey showed great acceptance of these methods by the students. The teachers also observed a great improvement in the students' mathematical abilities and in their keenness to attend classes, which was clearly reflected in the low absence record.Keywords: architecture, building surveying, student-centered learning, teaching and learning
Procedia PDF Downloads 251528 Terahertz Glucose Sensors Based on Photonic Crystal Pillar Array
Authors: S. S. Sree Sanker, K. N. Madhusoodanan
Abstract:
Optical biosensors are a dominant alternative to traditional analytical methods because of their small size, simple design and high sensitivity. Photonic sensing is one of the recently advancing technologies for biosensors. It measures the change in refractive index induced by the difference in molecular interactions due to the change in concentration of the analyte. Glucose is an aldose monosaccharide, which is a metabolic source in many organisms. Terahertz waves occupy the space between infrared and microwaves in the electromagnetic spectrum. They are expected to be applied to various types of sensors for detecting harmful substances in blood, cancer cells in skin and microbacteria in vegetables. We have designed glucose sensors using silicon-based 1D and 2D photonic crystal pillar arrays in the terahertz frequency range. The 1D photonic crystal has rectangular pillars with a height of 100 µm, length of 1600 µm and width of 50 µm. The array period of the crystal is 500 µm. The 2D photonic crystal has a 5 x 5 cylindrical pillar array with an array period of 75 µm. The height and diameter of the pillars are 160 µm and 100 µm, respectively. The two samples considered in the work are blood and a glucose solution, labelled as sample 1 and sample 2, respectively. The proposed sensor detects the concentration of glucose in the samples from 0 to 100 mg/dL. For this, the crystal was irradiated with 0.3 to 3 THz waves. By analyzing the obtained S-parameters, the refractive index of the crystal corresponding to each particular concentration of glucose was measured using the parameter retrieval method. The refractive indices of the two crystals decreased gradually with increasing concentration of glucose in the sample. The 1D photonic crystal showed a gradual decrease in refractive index at 1 THz; the 2D photonic crystal showed this behavior at 2 THz. The proposed sensor was simulated using CST Microwave Studio.
This will enable us to develop a model that can be used to characterize a glucose sensor. The present study is expected to contribute to blood glucose monitoring.Keywords: CST microwave studio, glucose sensor, photonic crystal, terahertz waves
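The parameter retrieval step can be illustrated with the textbook S-parameter inversion for a slab. This principal-branch sketch assumes an idealized impedance-matched, lossless slab (so s11 = 0 and s21 is a pure phase delay); the exact retrieval used with the CST model may differ.

```python
import numpy as np

def retrieve_n(s11, s21, freq_hz, d):
    """Principal-branch S-parameter retrieval for a slab of thickness d (m):
    n = arccos((1 - s11^2 + s21^2) / (2*s21)) / (k0*d).
    Valid while n*k0*d < pi (thicker slabs need branch tracking)."""
    k0 = 2 * np.pi * freq_hz / 3.0e8       # free-space wavenumber
    return np.arccos((1 - s11**2 + s21**2) / (2 * s21)) / (k0 * d)

# Round trip for an idealized matched slab: forward-model s21, then invert.
n_true, f, d = 1.8, 1.0e12, 50e-6          # n = 1.8 at 1 THz, 50 um slab
k0 = 2 * np.pi * f / 3.0e8
s21 = np.exp(-1j * n_true * k0 * d)        # pure phase delay through the slab
print(retrieve_n(0.0, s21, f, d).real)     # recovers ~1.8
```

Repeating this inversion across the 0.3-3 THz sweep for each glucose concentration yields the refractive-index-versus-concentration curves the sensor relies on.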
Procedia PDF Downloads 281527 Study on New Strategies of Sustainable Neighbourhood Design Based on the 2014 WAF
Authors: Zhou Xiaowen (China), Zhang Sanming (China)
Abstract:
Neighbourhood space, as a very important part of the city, is an organic combination of the material environment and the spiritual achievements of people's daily life, and it has a real impact upon the sustainable development of the whole city. Looking back on the 2014 World Architecture Festival (WAF), 4 of the 35 winning buildings were neighbourhood designs, and all of them mentioned space-sharing and sustainable development. In this paper, three award-winning cases are studied, including the World Building of the Year, The Chapel (Vietnam, A21 studio), as well as The Carve (Norway, A-Lab) and House for Trees (Vietnam, Vo Trong Nghia Architects). Urban context, planning, space construction and sustainable technology are discussed. On this basis, it was found that passive energy-saving technologies are receiving more and more attention, that shared space has been designed ingeniously, and that the architectural forms of these projects reflect social inclusion and equity. This paper aims to summarize the excellent works at the Festival and to provide a reference for future design.Keywords: neighbourhood design, 2014 World Architecture Festival (WAF), sustainable development, space-sharing
Procedia PDF Downloads 444