Search results for: Philip Cloud
175 Solar and Wind Energy Potential Study of Lower Sindh, Pakistan for Power Generation
Authors: M. Akhlaque Ahmed, Sidra A. Shaikh, Maliha A. Siddiqui
Abstract:
Studies of global and diffuse solar radiation on a horizontal surface over Lower Sindh, namely Karachi, Hyderabad, and Nawabshah, were carried out using sunshine-hour data of the area to assess the feasibility of solar energy utilization for power generation in Sindh province. The results show a large variation in the direct and diffuse components of solar radiation between the summer and winter months in Lower Sindh (50% direct and 50% diffuse for Karachi and Hyderabad). In the Nawabshah area, the contribution of diffuse solar radiation is low during the monsoon months of July and August. The KT (clearness index) value of Nawabshah indicates a clear sky throughout almost the entire year, and the percentage of diffuse radiation does not exceed 20%. In Nawabshah, the appearance of cloud is rare even during the monsoon months. The estimated values indicate that Nawabshah has high solar potential, whereas Karachi and Hyderabad have low solar potential. During the monsoon months, the lower part of Sindh can utilize a hybrid system with wind power. Near Karachi and Hyderabad, the wind speed ranges between 6.2 and 6.9 m/s. A wind corridor exists near Karachi, Hyderabad, Gharo, Keti Bander, and Shah Bander. The shortfall of solar can be compensated by wind because, in the monsoon months of July and August, wind speeds are higher in the lower region of Sindh.
Keywords: hybrid power system, lower Sindh, power generation, solar and wind energy potential
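A minimal sketch of the sunshine-hour method behind such estimates (the Angström-Prescott coefficients and the Erbs-type diffuse-fraction correlation below are standard textbook choices assumed for illustration, not the authors' fitted values):

    def clearness_index(n_sunshine, n_daylength, a=0.25, b=0.50):
        """Angstrom-Prescott estimate H/H0 = a + b*(n/N): the daily clearness
        index KT from measured sunshine hours n and maximum day length N."""
        return a + b * (n_sunshine / n_daylength)

    def diffuse_fraction_erbs(kt):
        """Erbs-type correlation for the diffuse fraction Hd/H."""
        if kt <= 0.22:
            return 1.0 - 0.09 * kt
        if kt <= 0.80:
            return (0.9511 - 0.1604 * kt + 4.388 * kt**2
                    - 16.638 * kt**3 + 12.336 * kt**4)
        return 0.165

    # Example: 8 sunshine hours out of a 12-hour day
    kt = clearness_index(8.0, 12.0)
    fd = diffuse_fraction_erbs(kt)
    print(f"KT = {kt:.2f}, diffuse = {100 * fd:.0f}%, direct = {100 * (1 - fd):.0f}%")

With these illustrative coefficients the example yields roughly a 50/50 split between diffuse and direct radiation, comparable to the values reported for Karachi and Hyderabad.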
174 Solar and Wind Energy Potential Study of Sindh Province, Pakistan for Power Generation
Authors: M. Akhlaque Ahmed, Sidra A. Shaikh, Maliha A. Siddiqui, Adeel Tahir
Abstract:
Studies of global and diffuse solar radiation on a horizontal surface over southern Sindh, namely Karachi, Hyderabad, and Nawabshah, were carried out using sunshine-hour data of the area to assess the feasibility of solar energy utilization in Sindh province for power generation. The results show a drastic variation in the diffuse and direct components of solar radiation between summer and winter for southern Sindh, where both components contribute 50% for Karachi and Hyderabad. In the Nawabshah area, the contribution of diffuse solar radiation is low in the monsoon months of July and August. The KT value of Nawabshah indicates a clear sky almost throughout the year, and the percentage of diffuse radiation does not exceed 20%. In Nawabshah, the appearance of cloud is rare even in the monsoon months. The estimated values indicate that Nawabshah has high solar potential, whereas Karachi and Hyderabad have low solar potential. During the monsoon months, the southern part of Sindh can utilize a hybrid system with wind power. Near Karachi and Hyderabad, the wind speed ranges between 6.2 and 6.9 m/s. A wind corridor exists near Karachi, Hyderabad, Gharo, Keti Bander, and Shah Bander. The shortfall of solar can be compensated by wind because, in the monsoon months of July and August, wind speeds are higher in the southern region of Sindh.
Keywords: hybrid power system, power generation, solar and wind energy potential, southern Sindh
173 A Unified Webcam Proctoring Solution on Edge
Authors: Saw Thiha, Jay Rajasekera
Abstract:
The boom in video conferencing generates millions of hours of video data daily to be analyzed. However, such enormous volumes of data pose scalability issues for efficient analysis, let alone analysis in real time, as online conferences can involve hundreds of people and last for hours. This paper proposes an efficient online proctoring solution that can analyze online conferences in real time on edge devices such as Android, iOS, and desktops. Since the computation is done upfront on the devices where the online conferences take place, the solution scales well without requiring intensive resources such as GPU servers and complex cloud infrastructure. According to the linear models, face orientation does indeed impact perceived eye openness. The proposed Z-score facial landmark standardization was also proven functional in detecting face orientation; it contributed to classifying eye blinks with a single eyelid-distance computation while achieving a better F1 score and accuracy than the Eye Aspect Ratio (EAR) threshold method. Last but not least, the authors implemented the solution natively in the MediaPipe framework and open-sourced it along with the reproducible experimental results on GitHub. The solution provides face orientation, eye blink, facial activity, and translation detections out of the box and is highly customizable and extensible.
Keywords: android, desktop, edge computing, blink, face orientation, facial activity and translation, MediaPipe, open source, real-time, video conference, web, iOS, Z score facial landmark standardization
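For context on the comparison above, here is a sketch of both ingredients: the classical Eye Aspect Ratio baseline and a per-face Z-score standardization of landmark coordinates. The landmark ordering and normalization details are illustrative assumptions, not the paper's exact implementation:

    import numpy as np

    def eye_aspect_ratio(eye):
        """EAR from six (x, y) eye landmarks p1..p6:
        EAR = (|p2-p6| + |p3-p5|) / (2*|p1-p4|); it drops toward 0 as the eye closes."""
        v1 = np.linalg.norm(eye[1] - eye[5])
        v2 = np.linalg.norm(eye[2] - eye[4])
        h = np.linalg.norm(eye[0] - eye[3])
        return (v1 + v2) / (2.0 * h)

    def zscore_standardize(landmarks):
        """Z-score standardization of facial landmarks: centering and rescaling
        make a single eyelid-distance threshold invariant to face size and position."""
        mu = landmarks.mean(axis=0)
        sigma = landmarks.std(axis=0) + 1e-9
        return (landmarks - mu) / sigma

    eye = np.array([[0, 0], [2, 1.2], [4, 1.2], [6, 0], [4, -1.2], [2, -1.2]], float)
    print("EAR (open eye):", round(eye_aspect_ratio(eye), 3))
    face = zscore_standardize(np.random.rand(468, 2))  # stand-in for MediaPipe's 468 face landmarks
    print("standardized mean/std:", face.mean().round(3), face.std().round(3))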
172 Artificial Intelligence for Generative Modelling
Authors: Shryas Bhurat, Aryan Vashistha, Sampreet Dinakar Nayak, Ayush Gupta
Abstract:
As technology advances toward high computational resources, there is a paradigm shift in the usage of these resources to optimize the design process. This paper discusses the use of generative design with artificial intelligence to build better models that apply operations like selection, mutation, and crossover to generate results. The human mind thinks of the simplest approach while designing an object, whereas the intelligence learns from the past and designs complex, optimized CAD models. Generative design takes the boundary conditions and iterates over multiple solutions to arrive at a sturdy design with the most optimal parameters, saving huge amounts of time and resources. The new production techniques at our disposal allow us to use additive manufacturing, 3D printing, and other innovative manufacturing techniques to save resources and design artistically engineered CAD models. The paper also discusses the genetic algorithm and the non-domination technique for choosing the right results, drawing on biomimicry, whose designs have evolved over millions of years. The computer uses parametric models to generate newer models through an iterative approach and uses cloud computing to store these iterative designs. The later part of the paper compares generative design with topology optimization, which has previously been used to generate CAD models. Finally, the paper shows the performance of the algorithms and how they help in designing resource-efficient models.
Keywords: genetic algorithm, biomimicry, generative modeling, non-dominant techniques
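A minimal sketch of the selection, mutation, and crossover loop described above (the toy fitness function and all parameters are invented for illustration, not taken from the paper):

    import random

    def fitness(x):
        # Toy objective: minimize material use while meeting a stiffness constraint.
        mass = sum(x)
        stiffness = sum(xi * w for xi, w in zip(x, (3, 2, 1, 2)))
        return -mass - (0 if stiffness >= 10 else 100)  # penalize infeasible designs

    def evolve(pop_size=30, genes=4, generations=50):
        pop = [[random.uniform(0, 5) for _ in range(genes)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            parents = pop[: pop_size // 2]            # selection: keep the fittest half
            children = []
            while len(children) < pop_size - len(parents):
                a, b = random.sample(parents, 2)
                cut = random.randrange(1, genes)      # crossover: one-point
                child = a[:cut] + b[cut:]
                i = random.randrange(genes)           # mutation: perturb one gene
                child[i] = max(0.0, child[i] + random.gauss(0, 0.3))
                children.append(child)
            pop = parents + children
        return max(pop, key=fitness)

    best = evolve()
    print("best design:", [round(g, 2) for g in best], "fitness:", round(fitness(best), 2))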
171 The Lived Experience of Pregnant Saudi Women Carrying a Fetus with Structural Abnormalities
Authors: Nasreen Abdulmannan
Abstract:
Fetal abnormalities are categorized as a structural abnormality, a non-structural abnormality, or a combination of both. Fetal structural abnormalities (FSA) include, but are not limited to, Down syndrome, congenital diaphragmatic hernia, and cleft lip and palate. These abnormalities can be detected in the early weeks of pregnancy, at around 9-20 weeks of gestation. Etiological factors for FSA are unknown; however, transmitted genetic risk can be one of them. Consanguineous marriage, often referred to as inbreeding, represents a significant risk factor for FSA due to the increased likelihood of deleterious genetic traits shared by both biological parents. In a country such as the Kingdom of Saudi Arabia (KSA), the rate of consanguineous marriage is high, which creates a significant risk of children being born with congenital abnormalities. Historically, the practice of consanguinity occurred commonly among European royalty. For example, Great Britain's Queen Victoria married her German first cousin, Prince Albert of Coburg. Although it was a more distant blood relationship, the United Kingdom's Queen Elizabeth II married her cousin, Prince Philip of Greece and Denmark; both were direct descendants of Queen Victoria. In Middle Eastern countries, a high incidence of consanguineous unions still exists, including in the KSA. Previous studies indicated that a significant gap exists in understanding the lived experiences of Saudi women dealing with an FSA-complicated pregnancy. Eleven participants were interviewed using a semi-structured interview format for this qualitative phenomenological study investigating the lived experiences of pregnant Saudi women carrying a child with FSA. This study explored the gaps in the current literature regarding the lived experiences of pregnant Saudi women whose pregnancies were complicated by FSA. In addition, the researcher acquired knowledge about the available support and resources as well as the Saudi cultural perspective on FSA. The research explored these lived experiences utilizing Giorgi's (2009) approach to data collection and data management. Findings cover five major themes: (1) initial maternal reaction to the FSA diagnosis upon ultrasound screening; (2) strengthening of the maternal relationship with God; (3) maternal concern for their child's future; (4) feeling supported by their loved ones; and (5) lack of healthcare provider support and guidance. Future research in the KSA is needed to explore the network support for these mothers. This study recommended further clinical nursing research, nursing education, clinical practice, and healthcare policy/procedures to provide opportunities for improvement in nursing care and to increase awareness in KSA society.
Keywords: fetal structural abnormalities, psychological distress, health provider, health care
170 Approaches to Ethical Hacking: A Conceptual Framework for Research
Authors: Lauren Provost
Abstract:
The digital world remains increasingly vulnerable, making the development of effective cybersecurity approaches ever more critical to the success of the digital economy and national security. Although approaches to cybersecurity have shifted and improved in the last decade with new models, especially with cloud computing and mobility, a record number of high-severity vulnerabilities was recorded by the National Institute of Standards and Technology (NIST) in its National Vulnerability Database (NVD) in 2020. This is due, in part, to the increasing complexity of cyber ecosystems. Security must be approached with a more comprehensive, multi-tool strategy that addresses the complexity of cyber ecosystems, including the human factor. Ethical hacking has emerged as such an approach: a more effective, multi-strategy, comprehensive approach to cybersecurity's most pressing needs, especially understanding the human factor. Research on ethical hacking, however, is limited in scope. The two main objectives of this work are to (1) provide highlights of case studies in ethical hacking and (2) provide a conceptual framework for research in ethical hacking that embraces and addresses both technical and nontechnical security measures. Recommendations include an improved conceptual framework for research centered on ethical hacking that addresses the many factors and attributes of significant attacks threatening computer security, and a more robust, integrative, multi-layered framework embracing the complexity of cybersecurity ecosystems.
Keywords: ethical hacking, literature review, penetration testing, social engineering
169 The Burmese Exodus of 1942: Towards Evolving Policy Protocols for a Refugee Archive
Authors: Vinod Balakrishnan, Chrisalice Ela Joseph
Abstract:
The Burmese Exodus of 1942, which left more than four lakh (400,000) people as refugees and thousands dead, is one of the worst forced migrations in recorded history. Adding to the woes of the refugees is the lack of credible documentation of their lived experiences, trauma, and stories, and their erasure from recorded history. Media reports, national records, and mainstream narratives that have registered the exodus provide sanitized versions which reduce the refugees to a nameless, faceless mass of travelers and obliterate their lived experiences, trauma, and sufferings. This attitudinal problem compels the need to stem the insensitivity that accompanies institutional memory by making a case for a more humanistically evolved policy that puts in place protocols for the way the humanities would voice concern for the refugee. A definite step in this direction, and a far more relevant project in our times, is the building of a comprehensive refugee archive that can be a repository of refugee experiences and perspectives. The paper draws on Hannah Arendt's position on the Jewish refugee crisis, Agamben's work on statelessness and citizenship, Foucault's notions of governmentality and biopolitics, Edward Said's concepts of exile, Fanon's work on the dispossessed, and Derrida's work on the foreigner and hospitality in order to conceptualize the refugee condition, which forms the theoretical framework for the paper. It also refers to existing scholarship in the field of refugee studies, such as Roger Zetter's work on the refugee label, Philip Marfleet's work on refugees and history, and Lisa Malkki's research on the anthropological discourse of the refugee and refugee studies. The paper is also informed by the work international organizations have done to address the refugee crisis. The emphasis is on building a strong argument for the establishment of the refugee archive, which finds but a passing and none too convincing reference in refugee studies, in order to enable a multi-dimensional understanding of the refugee crisis. Some of the old questions cannot be dismissed as outdated, as the continuing travails of refugees in different parts of the world remind us that they remain largely unanswered. The questions are: What is the nature of a refugee archive? How is it different from existing historical and political archives? What are the implications of the refugee archive? What is its contribution to refugee studies? The paper draws on Diana Taylor's concepts of the archive and the repertoire to theorize the refugee archive as a repository that has the documentary function of the 'archive' and the 'agency' function of the repertoire. It then reads Ayya's Accounts, a memoir by Anand Pandian, in the light of Hannah Arendt's concepts of the 'refugee as vanguard' and 'storytelling as political action' to illustrate how the memoir contributes to a refugee archive that gives the refugee a place and agency in history. The paper argues for a refugee archive that has implications for the formulation of inclusive refugee policies.
Keywords: Ayya's Accounts, Burmese Exodus, policy protocol, refugee archive
168 Improving Temporal Correlations in Empirical Orthogonal Function Expansions for Data Interpolating Empirical Orthogonal Function Algorithm
Authors: Ping Bo, Meng Yunshan
Abstract:
Satellite-derived sea surface temperature (SST) is a key parameter for many operational and scientific applications. However, a disadvantage of SST data is the high percentage of missing values, mainly caused by cloud coverage. The Data Interpolating Empirical Orthogonal Function (DINEOF) algorithm is an EOF-based technique for reconstructing missing data and has been widely used in the oceanographic field. Reconstructing SST images within a long time series using DINEOF can introduce large discontinuities; one solution to this problem is to filter the temporal covariance matrix to reduce the spurious variability. Building on previous research, an algorithm is presented in this paper to improve the temporal correlations in the EOF expansion. As in previous research, a filter, such as a Laplacian filter, is applied to the temporal covariance matrix, but the presented algorithm also considers the temporal relationship between the two consecutive images used in the filter: for example, two images in the same season are more likely to be correlated than two in different seasons, so the latter pair is weighted less in the filter. The presented approach is tested on the monthly nighttime 4-km Advanced Very High Resolution Radiometer (AVHRR) Pathfinder SST for the long-term period spanning 1989 to 2006. The results from the presented algorithm are compared to those from the original DINEOF algorithm without filtering and from the DINEOF algorithm with filtering but without taking the temporal relationship into account.
Keywords: data interpolating empirical orthogonal function, image reconstruction, sea surface temperature, temporal filter
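The seasonal weighting idea can be sketched as follows. This is a minimal illustration only; the exact weighting scheme and filter coefficients of the presented algorithm are not reproduced here:

    import numpy as np

    def season(month):
        return (month % 12) // 3  # 0..3: DJF, MAM, JJA, SON

    def filter_temporal_covariance(C, months, alpha=0.5, cross_season_weight=0.3):
        """Laplacian-type filter of the temporal covariance matrix C (T x T).
        Each row is smoothed with its temporal neighbors; a neighbor in a
        different season is down-weighted (illustrative scheme)."""
        T = C.shape[0]
        Cf = C.copy()
        for i in range(T):
            acc, wsum = np.zeros(T), 0.0
            for j in (i - 1, i + 1):
                if 0 <= j < T:
                    w = 1.0 if season(months[j]) == season(months[i]) else cross_season_weight
                    acc += w * (C[j] - C[i])
                    wsum += w
            if wsum > 0:
                Cf[i] = C[i] + alpha * acc / wsum
        return 0.5 * (Cf + Cf.T)  # keep the matrix symmetric

    # Toy monthly series starting in January
    T = 24
    months = [(m % 12) + 1 for m in range(T)]
    C = np.cov(np.random.rand(T, 50))  # stand-in for the SST temporal covariance
    C_filtered = filter_temporal_covariance(C, months)
    print(C_filtered.shape)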
167 3D Modelling and Numerical Analysis of Human Inner Ear by Means of Finite Elements Method
Authors: C. Castro-Egler, A. Durán-Escalante, A. García-González
Abstract:
This paper presents a method to generate a finite element model of the human auditory inner ear system. The geometric model was realized using 2D images from a virtual model of temporal bones. A point cloud was extracted manually from those images to construct a whole mesh with hexahedral elements. The main difference from predecessor models is the spiral shape of the cochlea with its three scalae completely defined: the scala tympani, scala media, and scala vestibuli, which are separated by the basilar membrane and Reissner's membrane. To validate this model, numerical simulations were run with two models: an isolated inner ear and a whole model of the human auditory system. Ideal displacement conditions are applied over the oval window in the isolated inner ear model. The whole model is made up of the outer auditory channel, the tympanic membrane, the ossicular chain, and the inner ear; its boundary condition is 1 Pa over the auditory channel entrance. The numerical FEM simulations use a harmonic analysis over a frequency range of 100 to 10,000 Hz with an interval of 100 Hz. The following results were obtained: the basilar membrane displacement; the scala media pressure along the cochlea length; and the transfer function of the middle ear normalized by the pressure at the tympanic membrane. The basilar membrane displacements and the pressure in the scala media make it possible to validate the frequency response of the basilar membrane.
Keywords: finite elements method, human auditory system model, numerical analysis, 3D modelling cochlea
166 Global and Diffuse Solar Radiation Studies over Seven Cities of Sindh, Pakistan for Power Generation
Authors: M. A. Ahmed, Sidra A. Shaik
Abstract:
Studies of global and diffuse solar radiation on a horizontal surface over seven cities of Sindh, namely Karachi, Hyderabad, Chore, Padidan, Nawabshah, Rohri, and Jacobabad, were carried out using sunshine-hour data of the area to assess the feasibility of solar energy utilization in Sindh province. The results obtained show a variation of the direct and diffuse components of solar radiation in the summer and winter months in southern Sindh (50% direct and 50% diffuse for Karachi and Hyderabad), whereas there is a large variation in the direct and diffuse components in the summer and winter months in the northern region (80% direct and 20% diffuse for Rohri and Jacobabad). In southern Sindh, the contribution of diffuse solar radiation is higher during the monsoon months (July and August); the sky remains clear from September to June. In northern Sindh (Rohri and Jacobabad), the contribution of diffuse solar radiation is low even in the monsoon months, i.e., July and August, and the KT value indicates a clear sky. In the northern part of Sindh, the percentage of diffuse radiation does not exceed 20%, and the appearance of cloud is rare. From the point of view of power generation, the estimated values indicate that the northern part of Sindh has high solar potential, while the southern part has low solar potential.
Keywords: global and diffuse solar radiation, solar potential, Province of Sindh, solar radiation studies for power generation
165 Variability of Climatic Elements in Nigeria Over Recent 100 Years
Authors: T. Salami, O. S. Idowu, N. J. Bello
Abstract:
Climatic variability is an essential consideration when dealing with climate change. The variability of individual climate parameters helps to determine how variable the climatic condition of a region will be. The most important of these climatic variables, which help to determine the climatic condition of an area, are temperature and precipitation. This research deals with long-term climatic variability in Nigeria. Variables examined in this analysis include near-surface temperature, near-surface minimum temperature, maximum temperature, relative humidity, vapour pressure, precipitation, wet-day frequency, and cloud cover, using data ranging from 1901 to 2010. Analyses were carried out using regression and empirical orthogonal function (EOF) analysis. Results show that the annual average, minimum, and maximum near-surface temperatures all gradually increase from 1901 to 2010, in the wet season and dry season alike. The linear trends of minimum near-surface temperature are significant for the annual, wet-season, and dry-season means. However, the diurnal temperature range decreases over the recent 100 years, implying that the minimum near-surface temperature has increased more than the maximum. Both precipitation and wet-day frequency decline in the analysis, demonstrating that Nigeria has become drier than before in terms of rainfall. Temperature and precipitation variability has become very high during these periods, especially in the northern areas. Areas that had excessive rainfall were confronted with flooding and other related issues, while areas that had less precipitation were confronted with drought. More practical issues will be presented.
Keywords: climate, variability, flooding, excessive rainfall
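A minimal sketch of the EOF analysis named above: a generic anomaly-plus-SVD decomposition on synthetic data (the study's actual preprocessing is not shown):

    import numpy as np

    def eof_analysis(field, n_modes=3):
        """EOF analysis of a (time x space) climate field: remove the time mean,
        then SVD the anomalies. Returns the leading spatial patterns (EOFs),
        principal-component time series, and the variance fraction per mode."""
        anomalies = field - field.mean(axis=0)
        U, s, Vt = np.linalg.svd(anomalies, full_matrices=False)
        variance_frac = s**2 / np.sum(s**2)
        pcs = U[:, :n_modes] * s[:n_modes]   # time series of each mode
        eofs = Vt[:n_modes]                  # spatial patterns
        return eofs, pcs, variance_frac[:n_modes]

    # Toy example: 110 years x 40 grid points of annual-mean temperature anomalies
    rng = np.random.default_rng(0)
    trend = np.linspace(0, 1, 110)[:, None]            # warming trend, 1901-2010
    field = trend + 0.3 * rng.standard_normal((110, 40))
    eofs, pcs, var = eof_analysis(field)
    print("variance explained by leading modes:", np.round(var, 2))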
164 Impact of Ecosystem Engineers on Soil Structuration in a Restored Floodplain in Switzerland
Authors: Andreas Schomburg, Claire Le Bayon, Claire Guenat, Philip Brunner
Abstract:
Numerous river restoration projects have been established in Switzerland in recent years after decades of human activity in floodplains. The success of restoration projects in terms of biodiversity and ecosystem functions largely depends on the development of the floodplain soil system. Plants and earthworms, as ecosystem engineers, are known to be able to build up a stable soil structure by incorporating soil organic matter into the soil matrix, which creates water-stable soil aggregates. Their engineering efficiency, however, largely depends on changing soil properties and frequent floods along an evolutive floodplain transect. This study therefore aims to quantify the effect of flood frequency and duration, as well as of physico-chemical soil parameters, on plants' and earthworms' engineering efficiency. It is furthermore predicted that these influences may affect one of the engineers differently, leading to varying contributions to aggregate formation within the floodplain transect. Ecosystem engineers were sampled and described in three floodplain habitats differentiated according to the evolutionary stages of the vegetation, ranging from pioneer to forest vegetation, in a floodplain restored 15 years ago. In addition, the same analyses were performed in an embanked adjacent pasture as a reference for the pre-restored state. Soil aggregates were collected and analyzed for their organic matter quantity and quality using Rock-Eval pyrolysis. Water level and discharge measurements dating back to 2008 were used to quantify the return period of major floods. Our results show an increasing amount of water-stable aggregates in the soil with increasing distance to the river, with the largest values in the reference site. Decreasing flood frequency and the proportion of silt and clay in the soil texture explain these findings, according to F values from a one-way ANOVA of a fitted mixed-effects model. Significantly larger amounts of labile organic matter signatures were found in soil aggregates in the forest habitat and in the reference site, indicating a larger contribution of plants to soil aggregation in these habitats compared to the pioneer vegetation zone. Earthworms' contribution to soil aggregation does not show significant differences along the floodplain transect, but their effect could be identified even in the pioneer vegetation, with its large proportion of coarse sand in the soil texture and frequent inundations. These findings indicate that ecosystem engineers seem able to create soil aggregates even under unfavorable soil conditions and frequent floods. Restoration success can therefore be expected even in ecosystems with harsh soil properties and frequent external disturbances.
Keywords: ecosystem engineers, flood frequency, floodplains, river restoration, rock eval pyrolysis, soil organic matter incorporation, soil structuration
163 Drivers on Climate in a Neotropical City: Urbanizations and Natural Variability
Authors: Nuria Vargas, Frances Rodriguez
Abstract:
Medium-sized Neotropical cities have opportunities to develop in a positive manner. Xalapa (the capital of Veracruz, Mexico) and its metropolitan region, near the Gulf of Mexico, has fewer than 1 million inhabitants, a medium city size, but it is growing rapidly, as are several cities in Latin America. The city, with its irregular topography, emerges inside a landscape that was once cloud forest and coffee land. The rapid growth of urbanization and the loss of vegetation have resulted in changes in climate parameters. Frequent warm spells, floods, and landslides have had impacts over the last two decades, and a higher incidence of dengue and diarrhea is reported in the region. Therefore, the analysis of hydrometeorological events is crucial to understanding the role they play in these problems. Urbanization and other radiative forcings have created a modulation that can explain the decadal climate changes in the Xalapa region. The Atlantic Multidecadal Oscillation directly influences the temperature and precipitation of the region, even more than climate change does. The total effect of these drivers can create a context of significantly greater risk. However, most policies consider only climate change as the principal factor, while other drivers are also important to consider and evaluate when implementing actions to improve our environment and cities in a context of climate change. Medium-sized cities could create better conditions for future citizens through preventive urban planning that considers the possible risks associated with weather and climate.
Keywords: natural variability, urbanization, Atlantic Multidecadal Oscillation, land use changes
162 Cracks Detection and Measurement Using VLP-16 LiDAR and Intel Depth Camera D435 in Real-Time
Authors: Xinwen Zhu, Xingguang Li, Sun Yi
Abstract:
Cracks are among the most common forms of damage in buildings, bridges, roads, and similar structures, and they may pose safety hazards. Cracks occur in structures of various materials. Traditional methods of manual detection and measurement, which are known to be subjective, time-consuming, and labor-intensive, are gradually becoming unable to meet the needs of modern development. In addition, crack detection and measurement need to be performed safely, considering space limitations and danger. Intelligent crack detection has therefore become a necessary line of research. In this paper, an efficient method for crack detection and quantification using a 3D sensor, a LiDAR, and a depth camera is proposed. The method works even in a dark environment, which is common in real-world applications. The LiDAR rapidly spins to scan the surrounding environment and discovers cracks through laser returns thousands of times per second, providing a rich 3D point cloud in real time. The LiDAR provides quite accurate depth information: the distance of each point can be determined to within around ±3 cm, and top-range models can see farther than 100 m. However, this error is still too large for some high-precision structures and materials. To measure crack depth much more accurately, the depth camera is needed; the cracks are scanned by the depth camera at the same time. Finally, all data from the LiDAR and depth camera are analyzed, and the size of the cracks can be quantified successfully. The comparison shows that the minimum and mean absolute percentage errors between the measured and calculated widths are about 2.22% and 6.27%, respectively. The experiments and results are presented in this paper.
Keywords: LiDAR, depth camera, real-time, detection and measurement
161 Using Geospatial Analysis to Reconstruct the Thunderstorm Climatology for the Washington DC Metropolitan Region
Authors: Mace Bentley, Zhuojun Duan, Tobias Gerken, Dudley Bonsal, Henry Way, Endre Szakal, Mia Pham, Hunter Donaldson, Chelsea Lang, Hayden Abbott, Leah Wilcynzski
Abstract:
Air pollution has the potential to modify the lifespan and intensity of thunderstorms and the properties of lightning. Using data mining and geovisualization, we investigate how background climate and weather conditions shape variability in urban air pollution and how this, in turn, shapes thunderstorms as measured by the intensity, distribution, and frequency of cloud-to-ground lightning. A spatiotemporal analysis was conducted to identify thunderstorms using high-resolution lightning detection network data. Over seven million lightning flashes were used to identify more than 196,000 thunderstorms that occurred between 2006 and 2020 in the Washington, DC Metropolitan Region. Each lightning flash in the dataset was grouped into thunderstorm events by means of a temporal and spatial clustering algorithm. Once the thunderstorm event database was constructed, hourly wind direction, wind speed, and atmospheric thermodynamic data were added to the initiation and dissipation times and locations of the 196,000 identified thunderstorms. Hourly aerosol and air quality data for the thunderstorm initiation times and locations were also incorporated into the dataset. Developing thunderstorm climatologies using a lightning tracking algorithm and lightning detection network data was found to be useful for visualizing the spatial and temporal distribution of urban-augmented thunderstorms in the region.
Keywords: lightning, urbanization, thunderstorms, climatology
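The grouping of flashes into storm events can be sketched with generic space-time clustering. DBSCAN on scaled space-time coordinates is an assumed stand-in here; the paper's own algorithm and thresholds are not specified in the abstract:

    import numpy as np
    from sklearn.cluster import DBSCAN

    def cluster_flashes(lat, lon, t_seconds, space_km=15.0, time_s=900.0):
        """Group lightning flashes into thunderstorm events by clustering in
        scaled space-time: one 'unit' = space_km kilometers = time_s seconds
        (illustrative thresholds). Returns an event label per flash (-1 = noise)."""
        km_per_deg = 111.0
        X = np.column_stack([
            lat * km_per_deg / space_km,
            lon * km_per_deg * np.cos(np.radians(lat.mean())) / space_km,
            t_seconds / time_s,
        ])
        return DBSCAN(eps=1.0, min_samples=5).fit_predict(X)

    # Toy example: two bursts of flashes about an hour apart
    rng = np.random.default_rng(1)
    lat = np.r_[38.9 + 0.05 * rng.standard_normal(200), 39.1 + 0.05 * rng.standard_normal(200)]
    lon = np.r_[-77.0 + 0.05 * rng.standard_normal(200), -76.8 + 0.05 * rng.standard_normal(200)]
    t = np.r_[rng.uniform(0, 1800, 200), rng.uniform(3600, 5400, 200)]
    labels = cluster_flashes(lat, lon, t)
    print("storms found:", len(set(labels) - {-1}))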
160 Multi-Temporal Mapping of Built-up Areas Using Daytime and Nighttime Satellite Images Based on Google Earth Engine Platform
Authors: S. Hutasavi, D. Chen
Abstract:
The built-up area is a significant proxy for measuring regional economic growth and reflects the Gross Provincial Product (GPP). However, an up-to-date and reliable database of built-up areas is not always available, especially in developing countries. Cloud-based geospatial analysis platforms such as Google Earth Engine (GEE) provide those countries with the accessibility and computational power to generate built-up data. Therefore, this study aims to extract the built-up areas in the Eastern Economic Corridor (EEC), Thailand, using daytime and nighttime satellite imagery on GEE. Normalized indices were generated from the Landsat 8 surface reflectance dataset, including the Normalized Difference Built-up Index (NDBI), the Built-up Index (BUI), and the Modified Built-up Index (MBUI), and applied to identify built-up areas in the EEC. The results show that the MBUI performs better than the BUI and NDBI, with the highest accuracy of 0.85 and a kappa of 0.82. Moreover, after incorporating nighttime light data from the Visible Infrared Imaging Radiometer Suite (VIIRS) Day/Night Band (DNB), the overall classification accuracy improved from 79% to 90%, and the error in the total built-up area decreased from 29% to 0.7%. The results suggest that the MBUI with nighttime light imagery is appropriate for built-up area extraction and can be utilized for further study of the socioeconomic impacts of regional development policy over the EEC region.
Keywords: built-up area extraction, google earth engine, adaptive thresholding method, rapid mapping
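The band math behind the daytime indices is simple. A sketch follows for NDBI and a BUI formed as NDBI minus NDVI, which are standard formulations; the exact MBUI modification used in the study is not reproduced here:

    import numpy as np

    def ndbi(swir1, nir):
        """Normalized Difference Built-up Index: built-up surfaces reflect more
        in SWIR than NIR, so NDBI > 0 tends to flag built-up pixels."""
        return (swir1 - nir) / (swir1 + nir + 1e-9)

    def ndvi(nir, red):
        return (nir - red) / (nir + red + 1e-9)

    def bui(swir1, nir, red):
        """Built-up Index: NDBI minus NDVI, suppressing the vegetation response."""
        return ndbi(swir1, nir) - ndvi(nir, red)

    # Landsat 8 surface reflectance bands: B4 = red, B5 = NIR, B6 = SWIR1
    red, nir, swir1 = (np.random.rand(100, 100) for _ in range(3))
    built_up_mask = bui(swir1, nir, red) > 0.0  # threshold chosen per scene
    print("built-up fraction:", built_up_mask.mean())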
159 Time and Cost Prediction Models for Language Classification Over a Large Corpus on Spark
Authors: Jairson Barbosa Rodrigues, Paulo Romero Martins Maciel, Germano Crispim Vasconcelos
Abstract:
This paper presents an investigation of the performance impacts of varying five factors (input data size, node number, cores, memory, and disks) when applying a distributed implementation of Naïve Bayes for text classification of a large corpus on the Spark big data processing framework. Problem: The algorithm's performance depends on multiple factors, and knowing the effects of each factor beforehand becomes especially critical as hardware is priced by time slice in cloud environments. Objectives: To explain the functional relationship between the factors and performance, and to develop linear predictor models for time and cost. Methods: The solid statistical principles of Design of Experiments (DoE), particularly the randomized two-level fractional factorial design with replications. This research involved 48 real clusters with different hardware arrangements. The metrics were analyzed using linear models for screening, ranking, and measuring each factor's impact. Results: Our findings include prediction models and show some non-intuitive results: the small influence of cores and the neutrality of memory and disks on total execution time, and the non-significant impact of input data scale on costs, although it notably impacts execution time.
Keywords: big data, design of experiments, distributed machine learning, natural language processing, spark
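A minimal sketch of the screening analysis: a 2^(5-2) fractional factorial with coded ±1 factor levels and an ordinary least-squares fit. The runtimes below are invented for illustration:

    import numpy as np

    # Coded factor levels (-1 = low, +1 = high) for a two-level fractional design.
    # Columns: data size, nodes, cores, memory, disks
    X = np.array([
        [-1, -1, -1, -1, -1],
        [ 1, -1, -1,  1,  1],
        [-1,  1, -1,  1, -1],
        [ 1,  1, -1, -1,  1],
        [-1, -1,  1,  1,  1],
        [ 1, -1,  1, -1, -1],
        [-1,  1,  1, -1,  1],
        [ 1,  1,  1,  1, -1],
    ], dtype=float)
    y = np.array([120, 300, 80, 210, 118, 305, 78, 200], dtype=float)  # runtimes (invented)

    # Fit y = b0 + sum(bi * xi); the |bi| rank each factor's effect on runtime.
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    names = ["intercept", "data size", "nodes", "cores", "memory", "disks"]
    for name, b in sorted(zip(names, coef), key=lambda p: -abs(p[1])):
        print(f"{name:>10}: {b:+.1f}")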
158 The Publishing Process and Results of the Chinese Annotated Edition of John Dewey’s “Experience and Education: The 60th Anniversary Edition”
Authors: Wen-jing Shan
Abstract:
The Chinese annotated edition of “Experience and Education: The 60th Anniversary Edition,” originally written in English by John Dewey (1859-1952), was published in 2015 by this author. A report on the process and results of the translation and annotation of the book is the purpose of this paper. It is worth mentioning that the original 1938 edition was considered the best concise statement on education by John Dewey, one of the most important educational theorists of the twentieth century. One feature of the 60th anniversary edition is that the original publisher, the Kappa Delta Pi International Honor Society, invited four contemporary Deweyan scholars who had been awarded the Society's Laureate Scholar honor to write reviews of the book; Dewey had been the first to receive this honor. The four scholars are Maxine Greene (1917-2014), Philip W. Jackson (1928-2015), Linda Darling-Hammond (1951-), and O. L. Davis, Jr. (1928-). The original 1938 edition was translated into Chinese five times after its publication in the U.S.A.: three times in the 1940s, once in the 1990s, and once in the 2010s. Nonetheless, the five translations have few or no annotations and contain some misinterpretations and gaps in information. The author retranslated and annotated the book to make the interpretations more faithful, expressive, and elegant, and to provide readers with better understanding and more correct information. The author started the project of translation and annotation, sponsored by the Taiwan Ministry of Science and Technology, in August 2011 and finished and published it by July 2015. The work was divided into three stages. First, in the preparatory stage of the project, the summary of each chapter, the rationale of the book, the textual commentary, the development of the original and Chinese editions, reviews and criticisms, and Dewey's biography and bibliography were investigated. Secondly, on the basis of this preliminary work, the annotated translation of Experience and Education, an epitome of Dewey's biography and bibliography, a chronology, and a critical introduction to Experience and Education were written. In the critical introduction, Dewey's philosophy of experience and educational ideas are examined along the timeline of human thought, and the vast literature about Dewey and his work is used to reveal the historical significance of Experience and Education for the modern age and to make the critical introduction more knowledgeable. Third, the final stage took another two years to review and revise the draft of the work and send it for publication. There are two parts to the book. The first part is a scholarly introduction, including Dewey's chronicle (in short form); Dewey's mind, people, and life; the importance of “Experience and Education”; and the necessity of re-translating and re-annotating “Experience and Education” into Chinese. The second part is the re-translated and re-annotated edition, including Dewey's “Experience and Education” and four papers written by contemporary scholars.
Keywords: John Dewey, experience and education: the 60th anniversary edition, translation, annotation
157 High-Resolution Spatiotemporal Retrievals of Aerosol Optical Depth from Geostationary Satellite Using Sara Algorithm
Authors: Muhammad Bilal, Zhongfeng Qiu
Abstract:
Aerosols, suspended particles in the atmosphere, play an important role in the Earth's energy budget, climate change, degradation of atmospheric visibility, urban air quality, and human health. To fully understand aerosol effects, the retrieval of aerosol optical properties, such as the aerosol optical depth (AOD), at high spatiotemporal resolution is required. Therefore, in the present study, hourly AOD observations at 500 m resolution were retrieved from the Geostationary Ocean Color Imager (GOCI) using the Simplified Aerosol Retrieval Algorithm (SARA) over the urban area of Beijing for the year 2016. SARA requires top-of-atmosphere (TOA) reflectance, solar and sensor geometry information, and surface reflectance observations to retrieve an accurate AOD. For validation of the GOCI-retrieved AOD, AOD measurements were obtained from Aerosol Robotic Network (AERONET) version 3 level 2.0 (cloud-screened and quality-assured) data. The errors and uncertainties were reported using the root mean square error (RMSE), the relative percent mean error (RPME), and the expected error (EE = ±(0.05 + 0.15·AOD)). Results showed that the high-spatiotemporal-resolution GOCI AOD observations were well correlated with the AERONET AOD measurements, with a correlation coefficient (R) of 0.92, an RMSE of 0.07, and an RPME of 5%, and 90% of the observations fell within the EE. The results suggest that SARA is robust and able to retrieve high-resolution spatiotemporal AOD observations over urban areas from a geostationary satellite.
Keywords: AERONET, AOD, SARA, GOCI, Beijing
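The validation statistics quoted above can be computed in a few lines. This sketch uses synthetic stand-ins for the matched GOCI-AERONET pairs; the EE envelope is the one stated in the abstract:

    import numpy as np

    def validate_aod(retrieved, reference):
        """RMSE, relative percent mean error, and the fraction of retrievals
        falling within the expected error envelope EE = +/-(0.05 + 0.15*AOD)."""
        diff = retrieved - reference
        rmse = np.sqrt(np.mean(diff**2))
        rpme = 100.0 * np.mean(diff / reference)
        ee = 0.05 + 0.15 * reference
        within_ee = np.mean(np.abs(diff) <= ee)
        return rmse, rpme, within_ee

    rng = np.random.default_rng(2)
    aeronet = rng.uniform(0.1, 1.5, 500)            # AERONET level 2.0 AOD (synthetic)
    goci = aeronet + rng.normal(0, 0.05, 500)       # matched GOCI retrievals (synthetic)
    rmse, rpme, frac = validate_aod(goci, aeronet)
    print(f"RMSE={rmse:.3f}, RPME={rpme:.1f}%, within EE: {100 * frac:.0f}%")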
156 Security Design of Root of Trust Based on RISC-V
Authors: Kang Huang, Wanting Zhou, Shiwei Yuan, Lei Li
Abstract:
As information technology develops rapidly, security has become increasingly critical for computer systems. In particular, as cloud computing and the Internet of Things (IoT) continue to gain widespread adoption, computer systems face new security threats and attacks. The Root of Trust (RoT) is the foundation for providing basic trusted computing and is used to verify the security and trustworthiness of other components. Designing a reliable Root of Trust and guaranteeing its own security are essential for improving the overall security and credibility of computer systems. In this paper, we discuss the implementation of self-security technology based on a RISC-V Root of Trust at the hardware level. To effectively safeguard the security of the Root of Trust, security safeguard technologies for the Root of Trust were studied. First, a lightweight secure boot framework is proposed as a protection mechanism. Second, two kinds of memory protection mechanisms are built to defend against memory attacks. Moreover, the hardware implementation of the proposed method has also been investigated. A series of experiments and tests have been carried out to verify the effectiveness of the proposed method. The experimental results demonstrate that the proposed approach is effective in verifying the integrity of the Root of Trust's own boot ROM, user instructions, and data, ensuring authenticity and enabling the secure boot of the Root of Trust's own system. Additionally, our approach provides memory protection against certain types of memory attacks, such as cache leaks and tampering, and ensures the security of root-of-trust sensitive information, including keys.
Keywords: root of trust, secure boot, memory protection, hardware security
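The integrity check at the heart of a secure boot chain can be sketched as a generic measure-and-compare flow. This is illustrative only; in the paper's hardware design the golden digests would be rooted in the RoT itself (e.g., ROM or fuses), not in mutable software as here:

    import hashlib

    # Golden digests provisioned at manufacturing time (values invented).
    GOLDEN = {
        "boot_rom":  hashlib.sha256(b"boot rom image").hexdigest(),
        "user_code": hashlib.sha256(b"user instructions").hexdigest(),
        "user_data": hashlib.sha256(b"user data").hexdigest(),
    }

    def verify_stage(name, image):
        """Measure an image and compare against its golden digest; boot halts
        on the first mismatch so later stages never run on tampered code."""
        return hashlib.sha256(image).hexdigest() == GOLDEN[name]

    def secure_boot(images):
        for stage in ("boot_rom", "user_code", "user_data"):
            if not verify_stage(stage, images[stage]):
                raise RuntimeError(f"secure boot halted: {stage} integrity check failed")
        print("all stages verified; transferring control")

    secure_boot({
        "boot_rom": b"boot rom image",
        "user_code": b"user instructions",
        "user_data": b"user data",
    })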
155 Vulnerability Assessment of Reinforced Concrete Frames Based on Inelastic Spectral Displacement
Authors: Chao Xu
Abstract:
Selecting ground motion intensity measures reasonably is one of the most important issues affecting the selection of input ground motions and the reliability of vulnerability analysis results. In this paper, inelastic spectral displacement is used as an alternative intensity measure to characterize ground motion damage potential. The inelastic spectral displacement is calculated based on modal pushover analysis, and an incremental dynamic analysis based on inelastic spectral displacement is developed. Probabilistic seismic demand analyses of a six-story and an eleven-story RC frame are carried out through cloud analysis and advanced incremental dynamic analysis. The sufficiency and efficiency of inelastic spectral displacement are investigated by means of regression and residual analysis and compared with those of elastic spectral displacement. Vulnerability curves are developed based on inelastic spectral displacement. The study shows that inelastic spectral displacement reflects the impact on inelastic structural response of frequency components with periods longer than the fundamental period. The damage potential of ground motion for structures whose fundamental period lengthens due to structural softening can be captured by inelastic spectral displacement. Compared with elastic spectral displacement, inelastic spectral displacement is a more sufficient and efficient intensity measure, which reduces the uncertainty of vulnerability analysis and the impact of input ground motion selection on vulnerability analysis results.
Keywords: vulnerability, probability seismic demand analysis, ground motion intensity measure, sufficiency, efficiency, inelastic time history analysis
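The efficiency comparison rests on the standard log-log probabilistic seismic demand model, ln(EDP) = ln(a) + b ln(IM), where a smaller residual dispersion indicates a more efficient intensity measure. A sketch on synthetic data (all values invented):

    import numpy as np

    def demand_regression(im, edp):
        """Fit ln(EDP) = ln(a) + b*ln(IM). The standard deviation of the
        residuals is the dispersion: smaller dispersion = more efficient IM."""
        x, y = np.log(im), np.log(edp)
        b, ln_a = np.polyfit(x, y, 1)
        resid = y - (ln_a + b * x)
        return b, np.exp(ln_a), resid.std(ddof=2)

    rng = np.random.default_rng(3)
    im = rng.uniform(0.05, 0.60, 80)                          # e.g., inelastic Sd (m)
    drift = 0.02 * im**1.1 * np.exp(rng.normal(0, 0.3, 80))   # max interstory drift
    b, a, beta = demand_regression(im, drift)
    print(f"b={b:.2f}, a={a:.4f}, dispersion={beta:.2f}")

Sufficiency is then checked by regressing these residuals against magnitude and distance and testing whether the slopes are statistically insignificant.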
154 Improving Cheon-Kim-Kim-Song (CKKS) Performance with Vector Computation and GPU Acceleration
Authors: Smaran Manchala
Abstract:
Homomorphic Encryption (HE) enables computations on encrypted data without requiring decryption, mitigating data vulnerability during processing. Usable Fully Homomorphic Encryption (FHE) could revolutionize secure data operations across cloud computing, AI training, and healthcare, providing both privacy and functionality; however, the computational inefficiency of schemes like Cheon-Kim-Kim-Song (CKKS) hinders their widespread practical use. This study focuses on optimizing CKKS for faster matrix operations through the implementation of vector computation parallelization and GPU acceleration. The variable effects of vector parallelization on GPUs were explored, recognizing that while parallelization typically accelerates operations, it can introduce overhead that results in slower runtimes, especially for smaller, less computationally demanding operations. To assess performance, two neural network models, an MLPN and a CNN, were tested on the MNIST dataset using both ARM and x86-64 architectures, with the CNN chosen for its higher computational demands. Each test was repeated 1,000 times, and outliers were removed via Z-score analysis to measure the effect of vector parallelization on CKKS performance. Model accuracy was also evaluated under CKKS encryption to ensure the optimizations did not compromise results. According to the results of the trial runs, applying vector parallelization yielded a 2.63x efficiency increase overall, with a 1.83x performance increase for x86-64 over the ARM architecture. Overall, these results suggest that vector parallelization in tandem with GPU acceleration significantly improves the efficiency of CKKS, even after accounting for vector parallelization overhead, with impact for future zero-trust operations.
Keywords: CKKS scheme, runtime efficiency, fully homomorphic encryption (FHE), GPU acceleration, vector parallelization
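The outlier handling and speedup computation described above can be sketched as follows (timings and thresholds invented; a |Z| <= 3 cutoff is an assumed choice):

    import numpy as np

    def remove_outliers_zscore(times, z_max=3.0):
        """Drop runtimes whose Z-score magnitude exceeds z_max, as done when
        condensing the 1,000 repetitions of each benchmark."""
        z = (times - times.mean()) / times.std()
        return times[np.abs(z) <= z_max]

    rng = np.random.default_rng(4)
    baseline = rng.normal(10.0, 0.5, 1000)      # CKKS matrix op, scalar path (s)
    parallel = rng.normal(3.8, 0.3, 1000)       # vectorized + GPU path (s)
    parallel[::200] += 15.0                     # inject stragglers / warm-up spikes

    base = remove_outliers_zscore(baseline)
    par = remove_outliers_zscore(parallel)
    print(f"speedup after outlier removal: {base.mean() / par.mean():.2f}x")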
153 Emergence of Information Centric Networking and Web Content Mining: A Future Efficient Internet Architecture
Authors: Sajjad Akbar, Rabia Bashir
Abstract:
With the growth in the number of users, Internet usage has evolved, and due to its key design principles, the Internet has expanded incredibly in size. This tremendous growth has brought new applications (mobile video and cloud computing) as well as new user requirements: a content distribution environment, mobility, ubiquity, security, trust, etc. Users are more interested in content than in the communicating peer nodes. The current Internet architecture is a host-centric networking approach, which is not well suited to these types of applications. With the growing use of multiple interactive applications, the host-centric approach is considered less efficient because it depends on physical location; for this reason, Information Centric Networking (ICN) is considered a potential future Internet architecture. ICN is an approach that introduces uniquely named data as a core Internet principle. It uses a receiver-oriented rather than a sender-oriented approach, and it introduces a naming-based information system at the network layer. Although ICN is considered a candidate future Internet architecture, there is much criticism of it, mainly concerning how ICN will manage the most relevant content. Web Content Mining (WCM) approaches can help with the appropriate data management of ICN. To address this issue, this paper contributes by (i) discussing multiple ICN approaches, (ii) analyzing different Web Content Mining approaches, and (iii) creating a new Internet architecture by merging ICN and WCM to solve the data management issues of ICN. From ICN, Content-Centric Networking (CCN) is selected for the new architecture, whereas, from Web Content Mining, an agent-based approach is selected to find the most appropriate data.
Keywords: agent based web content mining, content centric networking, information centric networking
152 Dow Polyols near Infrared Chemometric Model Reduction Based on Clustering: Reducing Thirty Global Hydroxyl Number (OH) Models to Less Than Five
Authors: Wendy Flory, Kazi Czarnecki, Matthijs Mercy, Mark Joswiak, Mary Beth Seasholtz
Abstract:
Polyurethane materials are present in a wide range of industrial segments such as furniture, building and construction, composites, automotive, electronics, and more. Dow is one of the leaders in the manufacture of the two main raw materials, isocyanates and polyols, used to produce polyurethane products. Dow is also a key player in the manufacture of polyurethane systems/formulations designed for targeted applications. In 1990, the first analytical chemometric models were developed and deployed for use in the Dow QC labs of the polyols business for the quantification of OH, water, cloud point, and viscosity. Over the years many models have been added; there are now over 140 models for quantification and hundreds for product identification, too many to support reasonably. There are 29 global models alone for the quantification of OH across more than 70 products at many sites. An attempt was made to consolidate these into a single model. While the consolidated model showed good statistics across the entire range of OH, several products had a bias by ASTM E1655 in individual product validation. This project summary will show the strategy for global model updates for OH, to reduce the number of models for quantification from over 140 to 5 or fewer using chemometric methods. To gain an understanding of the best product groupings, we identify clusters by reducing the spectra to a few dimensions via Principal Component Analysis (PCA) and Uniform Manifold Approximation and Projection (UMAP). Results from these cluster analyses and a separate validation set allowed Dow to reduce the number of models for predicting OH from 29 to 3 without loss of accuracy.
Keywords: hydroxyl, global model, model maintenance, near infrared, polyol
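A sketch of the grouping step on synthetic spectra (PCA followed by k-means; k-means is an assumed clustering choice, and the UMAP step used in the study is omitted here):

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(5)
    # Synthetic NIR spectra: 3 product families x 40 products x 700 wavelengths
    centers = rng.random((3, 700))
    spectra = np.vstack([c + 0.02 * rng.standard_normal((40, 700)) for c in centers])

    # Reduce the spectra to a few dimensions, then look for natural product groupings.
    scores = PCA(n_components=5).fit_transform(spectra)
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)

    # Each cluster becomes a candidate scope for one consolidated OH model.
    for k in range(3):
        print(f"cluster {k}: {np.sum(labels == k)} products")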
151 Field-Testing a Digital Music Notebook
Authors: Rena Upitis, Philip C. Abrami, Karen Boese
Abstract:
The success of one-on-one music study relies heavily on the ability of the teacher to provide sufficient direction to students during weekly lessons so that they can practice successfully from one lesson to the next. Traditionally, these instructions are given in a paper notebook, where the teacher makes notes for the student after describing a task or demonstrating a technique. The ability of students to make sense of these notes varies according to their understanding of the teacher's directions, their motivation to practice, their memory of the lesson, and their ability to self-regulate. At best, the notes enable the student to progress successfully; at worst, the student is left rudderless until the next lesson. Digital notebooks have the potential to provide a more interactive and effective bridge between music lessons than traditional pen-and-paper notebooks. One such digital notebook, Cadenza, was designed to streamline and improve teachers' instruction, to enhance student practicing, and to provide the means for teachers and students to communicate between lessons. For example, Cadenza contains a video annotator, where teachers can offer real-time guidance on uploaded student performances. Using the checklist feature, teachers and students negotiate the frequency and type of practice during the lesson, which the student can then access during subsequent practice sessions. Following the tenets of self-regulated learning, goal setting and reflection are also featured. Accordingly, the present paper addressed the following research questions: (1) How does the use of the Cadenza digital music notebook engage students and their teachers? (2) Which features of Cadenza are most successful? (3) Which features could be improved? (4) Is student learning and motivation enhanced with the use of the Cadenza digital music notebook? The paper describes the results of 10 months of field-testing of Cadenza, structured around the four research questions outlined. Six teachers and 65 students took part in the study. Data were collected through video-recorded lesson observations, digital screen captures, surveys, and interviews. Standard qualitative protocols for coding results and identifying themes were employed to analyze the results. The results consistently indicated that teachers and students embraced the digital platform offered by Cadenza. The practice log and timer, the real-time annotation tool, the checklists, the lesson summaries, and the commenting features were found to be the most valuable functions, by students and teachers alike. Teachers also reported that students progressed more quickly with Cadenza and received higher results in examinations than students who were not using Cadenza. Teachers identified modifications to Cadenza that would make it an even more powerful way to support student learning. These modifications, once implemented, will move the tool well past its traditional notebook uses to new ways of motivating students to practise between lessons and to communicate with teachers about their learning. Improvements called for by the teachers included the ability to duplicate archived lessons, split-screen viewing, and goal setting in the teacher window. In the concluding section, the proposed modifications and their implications for self-regulated learning are discussed.
Keywords: digital music technologies, electronic notebooks, self-regulated learning, studio music instruction
150 Extraction of Dyes Using an Aqueous Two-Phase System in Stratified and Slug Flow Regimes of a Microchannel
Authors: Garima, S. Pushpavanam
Abstract:
In this work, an analysis of an aqueous two-phase (polymer-salt) system for the extraction of sunset yellow dye is carried out. A polymer-salt ATPS, i.e., polyethylene glycol-600 (PEG-600) and anhydrous sodium sulfate, is used for the extraction. Conditions are chosen to ensure that the extraction concentrates the dye in one of the phases; the dye has a propensity to partition into the PEG-600 phase. The extracted sunset yellow dye is then degraded photocatalytically into less harmful components. The cloud point method was used to obtain the binodal curve of the ATPS. From the binodal curve, the composition of salt and PEG-600 was chosen such that the volume of the PEG-600-rich phase is low. This was selected to concentrate the dye from a dilute solution in a large volume of contaminated solution into a small volume. This pre-concentration step provides a high reaction rate for the photocatalytic degradation reaction. Experimentally, the dye is extracted from the salt phase to the PEG-600 phase in batch extraction; this was found to be very fast, and all the dye was extracted. The concentration of sunset yellow dye in the salt and polymer phases is measured at 482 nm by ultraviolet-visible spectrophotometry. The extraction experiment in microchannels under stratified flow is analyzed to determine the factors that affect the dye extraction. The focus will be on obtaining slug flow by adding nanoparticles in the microchannel. The primary aim is to exploit the fact that slug flow improves the mass transfer rate from one phase to another through shear-induced internal circulation in the dispersed phase.
Keywords: aqueous two phase system, binodal curve, extraction, sunset yellow dye
149 FSO Performance under High Solar Irradiation: Case Study Qatar
Authors: Syed Jawad Hussain, Abir Touati, Farid Touati
Abstract:
Free-Space Optics (FSO) is a wireless technology that enables the optical transmission of data through the air. FSO is emerging as a promising alternative or complementary technology to fiber-optic and wireless radio-frequency (RF) links due to its high bandwidth, robustness to EMI, and operation in unregulated spectrum. These systems are envisioned to be an essential part of future-generation heterogeneous communication networks. Despite the vibrant advantages of FSO technology and the variety of its applications, its widespread adoption has been hampered by rather disappointing link reliability for long-range links due to atmospheric turbulence-induced fading and sensitivity to detrimental climate conditions. Qatar, with modest cloud coverage, high concentrations of airborne dust, and high relative humidity, lies in a virtually rainless sunny belt with a typical daily average solar radiation exceeding 6 kWh/m2 and 80-90% clear skies throughout the year. The specific objective of this work is to study, for the first time in Qatar, the effect of solar irradiation on the deliverability of an FSO link. To analyze the transport medium, we ported an embedded Linux kernel onto a Field Programmable Gate Array (FPGA) and designed a network sniffer application that runs on the FPGA. We installed new FSO terminals and configured and aligned them successively. In the reporting period, we carried out measurements and related them to weather conditions.
Keywords: free space optics, solar irradiation, field programmable gate array, FSO outage
148 Image Recognition Performance Benchmarking for Edge Computing Using Small Visual Processing Unit
Authors: Kasidis Chomrat, Nopasit Chakpitak, Anukul Tamprasirt, Annop Thananchana
Abstract:
Internet of Things (IoT) devices and edge computing have become among the biggest developments in innovation and among the most discussed for their potential to improve and disrupt traditional business and industry alike. New challenges have arisen, such as the COVID-19 pandemic, which endangered the workforce and business processes. Alongside the drastically changed business landscape left in the aftermath of the global COVID-19 pandemic, and the looming threats of a global energy crisis, global warming, and heated global politics that risk becoming a new Cold War, emerging technologies like edge computing and the use of specially designed visual processing units present great opportunities for business. The literature is reviewed on how the Internet of Things and this disruptive wave will affect business, explaining how these new developments affect current business, how businesses will need to adapt to changes in the market and the world, and how example benchmark tests of newer consumer-marketed devices, such as IoT devices equipped with edge computing hardware, show increased efficiency and reduced risk from current and looming crises. Throughout the paper, we explain the technologies leading the present moment and why these technologies will be innovations that change traditional practice, through brief introductions to cloud computing, edge computing, and the Internet of Things, and how they will lead into the future.
Keywords: internet of things, edge computing, machine learning, pattern recognition, image classification
147 Infrastructure Change Monitoring Using Multitemporal Multispectral Satellite Images
Authors: U. Datta
Abstract:
The main objective of this study is to find a suitable approach to monitoring land infrastructure growth over a period of time using multispectral satellite images. A bi-temporal change detection method is unable to indicate the continuous change occurring over a long period of time. To achieve the objective, the approach used here estimates a statistical model from a series of multispectral images acquired over a long period of time, assuming there is no considerable change during that period, and then compares it with a multispectral image obtained at a later time. The change is estimated pixel-wise. A statistical composite hypothesis technique is used for pixel-based change detection in a defined region. The generalized likelihood ratio test (GLRT) is used to detect changed pixels from the estimated probabilistic model of the corresponding pixel. Changed pixels are detected under the assumption that the images have been co-registered prior to estimation; to minimize error due to co-registration, the 8-neighborhood pixels around the pixel under test are also considered. Multispectral images from Sentinel-2 and Landsat-8 from 2015 to 2018 are used for this purpose. There are several challenges in this method. The first and foremost is obtaining a sufficiently large number of datasets for multivariate distribution modelling; a large number of images are always discarded due to cloud coverage. Due to imperfect modelling, there will be a high probability of false alarms. The overall conclusion that can be drawn from this work is that the probabilistic method described in this paper has given some promising results, which need to be pursued further.
Keywords: co-registration, GLRT, infrastructure growth, multispectral, multitemporal, pixel-based change detection
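A single-band sketch of the per-pixel GLRT under a Gaussian no-change model (the study uses multivariate multispectral data; the threshold and neighborhood handling here are illustrative):

    import numpy as np

    def glrt_change_map(series, new_image, threshold=9.0):
        """Estimate a per-pixel Gaussian model (mean, variance) from a 'no-change'
        time series, then flag pixels of a later image whose GLRT statistic
        ((x - mu)/sigma)^2 exceeds the threshold. To tolerate small co-registration
        errors, each pixel keeps the minimum statistic over its 8-neighborhood."""
        mu = series.mean(axis=0)
        sigma = series.std(axis=0) + 1e-6
        stat = ((new_image - mu) / sigma) ** 2
        # minimum over the 3x3 neighborhood (8 neighbors + center)
        padded = np.pad(stat, 1, mode="edge")
        shifts = [padded[i:i + stat.shape[0], j:j + stat.shape[1]]
                  for i in range(3) for j in range(3)]
        stat_min = np.min(shifts, axis=0)
        return stat_min > threshold

    rng = np.random.default_rng(6)
    series = rng.normal(0.2, 0.02, (24, 64, 64))   # 24 co-registered acquisitions
    new = rng.normal(0.2, 0.02, (64, 64))
    new[20:30, 20:30] += 0.3                        # simulated new construction
    change = glrt_change_map(series, new)
    print("changed pixels:", int(change.sum()))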
146 Improving Waste Recycling and Resource Productivity by Integrating Smart Resource Tracking System
Authors: Atiq Zaman
Abstract:
The high contamination rate in the recycling waste stream is one of the major problems in Australia. In addition, a lack of reliable waste data makes it even more difficult to design and implement an effective waste management plan. This article conceptualizes the opportunity to improve resource productivity by integrating a smart resource tracking system (SRTS) into the Australian household waste management system. The smart resource tracking system will be implemented in the following ways: (i) a mobile-application-based resource tracking system used to measure households' material flows; (ii) RFID, smart imaging, and weighing systems used to track waste generation, recycling, and contamination; (iii) informing and motivating manufacturers and retailers to improve the packaging of problematic products; and (iv) ensuring quality and reliable data through open-sourced cloud data for public use. Smart mobile applications, imaging, radio-frequency identification (RFID), and weighing technologies are not new, but the very straightforward idea of applying these technologies to household resource consumption, waste bins, and collection trucks will open up a new era of accurately measuring and effectively managing our waste. The idea will bring the most urgently needed reliable data and clarity on household consumption, recycling behaviour, and waste management practices in the context of available local infrastructure and policies. Therefore, the findings of this study would be very important for decision makers seeking to improve resource productivity in the waste industry by using a smart resource tracking system.
Keywords: smart devices, mobile application, smart sensors, resource tracking, waste management, resource productivity