Search results for: digital developments
917 An Interpretative Historical Analysis of Asylum and Refugee Policies and Attitudes to Australian Immigration Laws
Authors: Kamal Kithsiri Karunadasa Hewawasam Revulge
Abstract:
This paper is an interpretative historical analysis of Australian migration laws that examines asylum and refugee policies and attitudes in Australia. It looks at major turning points in Australian migration history, and in doing so, the researcher reviewed relevant literature on the aspects crucial to highlighting the current trend of Australian migration policies. Data were collected from secondary official government sources, including annual reports, media releases on immigration, inquiry reports, statistical information, and other available literature, in order to identify critical historical events that significantly shaped the development of asylum seeker and refugee policies in Australia and to trace the historical trends of official thinking. Reliance on these official sources is justified, as they are the most authoritative sources for analysing historical events in Australia. Additional literature provides critical analyses of the behaviour and culture of the Australian immigration administration. The analytical framework reviewed key Australian Government immigration policies from British colonization and the settlement era (1787 to the 1850s) to the present. The rationale for doing so is that past events and incidents offer clues and lessons relevant to the present day. Providing a perspective on migration history in Australia therefore helps to analyse how current policymakers' strategies developed and changed over time. Attention is also explicitly given to Australian asylum and refugee policy in its international context, as this broadens the analysis. The findings demonstrate a link between past events and current adverse Australian government policies towards asylum seekers and refugees. They highlight that Australia's current migration policies are part of a carefully and deliberately planned pattern that arose from the occupation of Australia by early British settlers.
In this context, the remarkable point is that the historical practice of taking children away from their Australian Indigenous parents, widely known as the 'stolen generation', reflected a model of assimilation: a desire to absorb other cultures into Australian society by having them fully adopt the settlers' language and culture at the cost of Indigenous people's traditions. Current Australian policies towards migrants reflect the same attitude. Hence, it could be argued that policies and attitudes towards asylum seekers and refugees, particularly so-called 'boat people', still to some extent reflect Australia's earlier colonial and 'white Australia' history.
Keywords: migration law, refugee law, international law, administrative law
Procedia PDF Downloads 84
916 Privacy Protection Principles of Omnichannel Approach
Authors: Renata Mekovec, Dijana Peras, Ruben Picek
Abstract:
The advent of the Internet, mobile devices, and social media is revolutionizing the retail customer experience by linking multiple sources through various channels. Omnichannel retailing combines multiple channels so that customers can seamlessly draw on all distribution information, online and offline, while shopping. Data are therefore a more critical asset than ever for all organizations. Nonetheless, because of its heterogeneity across platforms, developers currently face difficulties in dealing with personal data. Considering the possibilities of omnichannel communication, this paper presents a channel categorization that could enhance the customer experience of an omnichannel center, called a hyper center. The purpose of this paper is fundamentally to describe the connection between the omnichannel hyper center and the customer, with particular attention to privacy protection. The first phase was finding the most appropriate channels of communication for the hyper center. Accordingly, a selection of widely used communication channels was identified and analyzed with regard to the requirements for optimizing user experience. The evaluation criteria are divided into three groups: general, user profile, and channel options. For each criterion, a weight of importance for omnichannel communication was defined. A key consideration was how the hyper center can identify users while respecting privacy protection requirements. The study also shows what the customer experience across digital networks would look like under an omnichannel approach grounded in privacy protection principles.
Keywords: personal data, privacy protection, omnichannel communication, retail
Procedia PDF Downloads 148
915 Method for Selecting and Prioritising Smart Services in Manufacturing Companies
Authors: Till Gramberg, Max Kellner, Erwin Gross
Abstract:
This paper presents a comprehensive investigation into smart services and IIoT platforms, focusing on their selection and prioritization in manufacturing organizations. First, a literature review is conducted to provide a basic understanding of the current state of research on smart services. Based on established definitions from the literature, a working definition for this paper is developed. In addition, value propositions for smart services are identified from the literature and expert interviews, and the general requirements for providing smart services are presented. Subsequently, existing approaches for the selection and development of smart services are identified and described. To determine the requirements for selecting smart services, expert opinions from companies that have already successfully implemented smart services are collected through semi-structured interviews. Based on the results, criteria for evaluating existing methods are derived, and the existing methods are evaluated against these criteria. Furthermore, a novel method for selecting smart services in manufacturing companies is developed, taking into account the identified criteria and the existing approaches. The concept for the method is verified in expert interviews. The method includes a collection of relevant smart services identified in the literature; the actual relevance of these use cases in the industrial environment was validated in an online survey. The required data and sensors are assigned to the smart service use cases, and the value proposition of the use cases is evaluated in an expert workshop using different indicators. Based on this, a comparison is made between the identified value proposition and the required data, leading to a prioritization process that follows an established procedure for evaluating technical decisions.
In addition to the technical requirements, the prioritization process includes other evaluation criteria, such as the economic benefit, the conformity of the new service offering with the company strategy, and the customer retention enabled by the smart service. Finally, the method is applied and validated in an industrial environment. The results of these experiments are critically reflected upon, and an outlook on future developments in the area of smart services is given. This research contributes to a deeper understanding of the selection and prioritization process, as well as the technical considerations associated with smart service implementation in manufacturing organizations. The proposed method serves as a valuable guide for decision makers, helping them select the most appropriate smart services for their specific organizational needs.
Keywords: smart services, IIoT, Industrie 4.0, IIoT platform, big data
Procedia PDF Downloads 90
914 Tussle of Intellectual Property Rights and Privacy Laws with Reference to Artificial Intelligence
Authors: Lipsa Dash, Gyanendra Sahu
Abstract:
Intelligence is the cornerstone of humanity, and humans have now created an artificial counterpart of themselves. Our understanding of the word intelligence is largely perspective-based: the capacity to read, write, perceive, and better understand the adversities around us. A wide range of industrial sectors have also begun using the technology to perceive, reason, and act. Intellectual property, similarly, is the product of human intelligence and creativity. The World Intellectual Property Organisation is currently tracking technology trends across the globe, and AI tops the list of digital frontier technologies expected to have a profound impact on the world, transforming the way we live and work. Within intellectual property, patents on, and creations of, AIs themselves have constantly been in question. This paper explores whether AIs can fit within the flexibilities of the TRIPS Agreement (Trade-Related Aspects of Intellectual Property Rights) and the gaps in existing IP laws, or whether amendment is needed to bring them within their ambit. The researcher also explores the rights of AIs that create things out of their intelligence, and whether they could qualify as legal persons, making other laws applicable to them. The paper examines the differentiation between AI creations and human creations, and the need for amendments to determine authorship, ownership, inventorship, protection, identification of the beneficiary for remuneration, and even liability. Humans and humanoids alike are entangled in matters of privacy, which raises a further constitutional legal issue to be addressed. The authors focus on the legal conundrums of AI, transhumanism, and the Internet of Things.
Keywords: artificial intelligence, humanoids, healthcare, privacy, legal conundrums, transhumanism
Procedia PDF Downloads 126
913 Statistical and Analytical Comparison of GIS Overlay Modelings: An Appraisal on Groundwater Prospecting in Precambrian Metamorphics
Authors: Tapas Acharya, Monalisa Mitra
Abstract:
Overlay modeling is the most widely used conventional analysis in spatial decision support systems. It requires a set of themes, each weighted in a particular manner, whose combination provides the input for further integrated analysis. Despite its popularity, it can give inconsistent and erroneous results for the same inputs when processed with different GIS overlay techniques. This study compares and analyses the differences in the outputs of different overlay methods on a GIS platform, using the same set of themes, for groundwater prospecting in Precambrian metamorphic rocks. The objective is to identify the most suitable overlay method for groundwater prospecting in older Precambrian metamorphics. Seven thematic input layers (slope, Digital Elevation Model (DEM), soil thickness, lineament intersection density, average groundwater table fluctuation, stream density, and lithology) were used in three spatial overlay models, namely fuzzy overlay, weighted overlay, and weighted sum overlay, to delineate suitable groundwater prospective zones. Spatial concurrence analysis with high-yielding wells of the study area, together with statistical comparison of the model outputs using RStudio, reveals that the weighted overlay model is the most efficient GIS overlay model for delineating groundwater prospecting zones in Precambrian metamorphic rocks.
Keywords: fuzzy overlay, GIS overlay model, groundwater prospecting, Precambrian metamorphics, weighted overlay, weighted sum overlay
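The weighted overlay model the abstract identifies as most efficient can be illustrated with a minimal sketch: each reclassified thematic raster (scores on a common suitability scale) is scaled by its weight of importance, and the weighted values are summed per cell. The layer names, scores, and weights below are invented for illustration and are not the study's actual values.

```python
import numpy as np

def weighted_overlay(layers, weights):
    """Weighted overlay: sum of weight * suitability score per cell.
    layers: dict name -> 2D array of suitability scores (e.g. 1-5);
    weights: dict name -> fractional weight; weights must sum to 1."""
    if abs(sum(weights.values()) - 1.0) > 1e-9:
        raise ValueError("weights must sum to 1")
    result = np.zeros_like(next(iter(layers.values())), dtype=float)
    for name, raster in layers.items():
        result += weights[name] * raster
    return result

# Hypothetical 2x2 reclassified rasters for three of the seven themes
layers = {
    "slope":             np.array([[5, 3], [2, 4]], dtype=float),
    "lineament_density": np.array([[4, 4], [1, 5]], dtype=float),
    "lithology":         np.array([[3, 2], [5, 3]], dtype=float),
}
weights = {"slope": 0.5, "lineament_density": 0.3, "lithology": 0.2}

score = weighted_overlay(layers, weights)
print(score)  # per-cell groundwater-prospectivity score
```

Cells with higher composite scores would be delineated as more prospective zones; a weighted sum overlay differs in that the weights need not be normalized to 1.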
Procedia PDF Downloads 128
912 Blade Runner and Slavery in the 21st Century
Authors: Bülent Diken
Abstract:
This paper reads Ridley Scott's original film Blade Runner (1982) alongside Denis Villeneuve's Blade Runner 2049 (2017) in order to analyse both films with respect to the new configurations of slavery in the 21st century. Both Blade Runner films present a de-politicized society that oscillates between two extremes: the spectral (the eye, optics, digital communications) and the biopolitical (the body, haptics). On the one hand, recognizing the subject only as a sign, the society of the spectacle registers, identifies, produces, and reproduces the subject as a code. At the same time, the subject is constantly reduced to a naked body, to bare life, for biometric technologies to scan as a biological body or body parts. Being simultaneously pure code (word without body) and instrument-slave (body without word), the replicants are the paradigmatic subjects of this society. The paper focuses first on the similarity: both films depict a relationship between masters and slaves, that is, a despotic relationship, in which the master uses the (body of the) slave as an instrument, as an extension of his own body. The original film, set in 2019, frames the despotic relation in this classical way through its triangulation with the economy (the Tyrell Corporation) and the slave-replicants' dissent (their rejection of being reduced to mere instruments). In a counter-classical approach, Blade Runner 2049 shifts the focus to another triangulation: despotism, economy (the Wallace Corporation), and consent (of replicants who no longer perceive themselves as slaves).
Keywords: Blade Runner, the spectacle, biopolitics, slavery, instrumentalisation
Procedia PDF Downloads 69
911 Unionisation, Participation and Democracy: Forms of Convergence and Divergence between Union Membership and Civil and Political Activism in European Countries
Authors: Silvia Lucciarini, Antonio Corasaniti
Abstract:
The issue of democracy in capitalist countries has once again become a focus of debate in recent years, as a number of socio-economic and political tensions have opened the topic to various perspectives and disciplines. Political developments, including the rise of right-wing parties and populism and the constant growth of inequalities in a context of welfare downsizing, have led scholars to question whether European capitalist countries are really capable of creating and redistributing resources, and to look for elements that might make the democratic capital of European countries more dense. The aim of this work is to shed light on the trajectories, intensity, and convergence or divergence between political and associative participation, on one hand, and union organization, on the other, as these constitute two of the main points of connection between the norms, values, and actions that bind citizens to the state. Using the European Social Survey database, some studies have analysed degrees of unionization by investigating the relationship between systems of industrial relations and vulnerable groups (in terms of value-oriented practices or political participation). This paper instead investigates the relationship between union participation and civil/political participation, comparing union members and non-members and then distinguishing between employees and self-employed professionals to better understand participatory behaviours among different workers. The first component of the research employs a multilinear logistic model on a sample of 10 countries, selected according to a grid that combines the industrial relations models identified by Visser (2006) and the welfare state systems identified by Esping-Andersen (1990). On the basis of this sample, we compare the choices made by workers and their propensity to join trade unions, together with their level of social and political participation, from 2002 to 2016.
In the second component, we verify whether workers within the same system of industrial relations and welfare show a similar propensity to engage in civil participation through political bodies and associations, or whether these tendencies take on more specific and varied forms. The results will show: (1) whether political participation is higher among unionized workers than among the non-unionized; (2) what differences in unionisation and civil/political participation exist between self-employed, temporary, and full-time employees; and (3) whether the trajectories within industrial relations and welfare models display greater inclusiveness and participation, thereby confirming or disproving the patterns that have been documented among the different European countries.
Keywords: union membership, participation, democracy, industrial relations, welfare systems
Procedia PDF Downloads 142
910 Performance Comparison of Tablet Devices and Medical Diagnostic Display Devices Using Digital Object Patterns in PACS Environment
Authors: Yan-Lin Liu, Cheng-Ting Shih, Jay Wu
Abstract:
Tablet devices have been introduced into the medical environment in recent years. Display performance varies with hardware specifications and the type of display technology used, so the differences between tablet devices and medical diagnostic LCDs must be verified to ensure that image quality is not jeopardized for clinical diagnosis in a picture archiving and communication system (PACS). In this study, a set of randomized object test patterns (ROTPs) was developed, consisting of randomly located spheres in abdominal CT images. Five radiologists independently reviewed the CT images on different generations of iPads and on a diagnostic monochrome medical LCD monitor. Receiver operating characteristic (ROC) analysis was performed using a five-point rating scale, and the average area under the curve (AUC) and average reading time (ART) were calculated. The AUC values for the second-generation iPad, iPad mini, iPad Air, and the monochrome medical monitor were 0.712, 0.717, 0.725, and 0.740, respectively; the differences between the iPads were not significant. The ARTs were 177 min and 127 min for the iPad mini and the medical LCD monitor, respectively, a significant difference (p = 0.04). The results show that the iPads were slightly inferior to the monochrome medical LCD monitor. However, tablet devices offer portability and versatility, which can improve the convenience of rapid diagnosis and teleradiology. With advances in display technology, the applicability of tablet and mobile devices in PACS may become more diversified.
Keywords: tablet devices, PACS, receiver operating characteristic, LCD monitor
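The ROC analysis described above rests on a standard equivalence: the AUC equals the probability that a randomly chosen positive case receives a higher confidence rating than a randomly chosen negative case, with ties counting half (the rescaled Mann-Whitney U statistic). A minimal sketch with invented five-point ratings, not the study's data:

```python
def auc_from_ratings(pos, neg):
    """Nonparametric AUC estimate from ordinal confidence ratings:
    fraction of (positive, negative) pairs where the positive case
    is rated higher, counting ties as half."""
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

# Hypothetical ratings (1 = sphere definitely absent ... 5 = definitely present)
pos = [5, 4, 4, 3, 5, 2]   # images that truly contain a sphere
neg = [1, 2, 3, 1, 2, 4]   # images without a sphere

print(round(auc_from_ratings(pos, neg), 3))
```

In the study itself, AUCs were averaged over five readers per display; fitting a smooth binormal ROC curve to the rating data would be the more conventional observer-study analysis.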
Procedia PDF Downloads 481
909 The Post-Hegemony of Post-Capitalism: Towards a Political Theory of Open Cooperativism
Authors: Vangelis Papadimitropoulos
Abstract:
The paper is part of the research project 'Techno-Social Innovation in the Collaborative Economy', funded by the Hellenic Foundation for Research and Innovation for the years 2022-2024. The project examines the normative and empirical conditions of grassroots, technologically driven innovation that could enable the transition towards a commons-oriented post-capitalist economy. It carries out a conceptually led and empirically grounded multi-case study of the digital commons, open-source technologies, platform cooperatives, open cooperatives, and Decentralized Autonomous Organizations (DAOs) on the blockchain. The methodological scope of the research is interdisciplinary, comprising political theory, economics, sustainability science, and computer science, among others. The research draws specifically on Michel Bauwens and Vasilis Kostakis' model of open cooperativism between the commons, ethical market entities, and a partner state. Bauwens and Kostakis advocate a commons-based, counter-hegemonic post-capitalist transition beyond and against neoliberalism. The research further employs Laclau and Mouffe's discourse theory of hegemony to introduce a post-hegemonic conceptualization of the model of open cooperativism. The paper thus aims to outline the theoretical contribution of the research project to contemporary political theory debates on post-capitalism and the collaborative economy.
Keywords: open cooperativism, techno-social innovation, post-hegemony, post-capitalism
Procedia PDF Downloads 66
908 Immersive Environment as an Occupant-Centric Tool for Architecture Criticism and Architectural Education
Authors: Golnoush Rostami, Farzam Kharvari
Abstract:
In recent years, developments in architectural education have resulted in a shift from conventional teaching methods to alternative, state-of-the-art teaching methods and strategies. Criticism has been a key player in both the architectural profession and education, but it has mostly been offered by renowned individuals. Hence, not only students and other professionals but also critics themselves may lack the opportunity to experience buildings, relying instead on available 2D materials, such as images and plans, that may not yield a holistic understanding and evaluation of buildings. Immersive environments, on the other hand, allow students and professionals to experience buildings virtually and to ground their evaluation in experience rather than in judgments based on 2D materials. The aim of this study is therefore to compare the effect of experiencing buildings in immersive environments versus through 2D drawings, including images and plans, on architecture criticism and architectural education. Three buildings with parametric brick facades were studied, through 2D materials and in Unreal Engine v. 24 as an immersive environment, by 22 architecture students who were selected using convenience sampling and divided into two equal groups using simple random sampling. The study used mixed methods: the quantitative section was carried out with a questionnaire, and in-depth interviews were used for the qualitative section. A questionnaire was developed to measure three constructs: privacy regulation based on Altman's theory, the sufficiency of illuminance levels in the building, and the visual status of the view (visually appealing views, accounting for obstructions that may be caused by the facades). Furthermore, participants reflected their understanding and evaluation of the buildings in individual interviews.
The collected questionnaire data were analyzed using independent t-tests and descriptive analyses in IBM SPSS Statistics v. 26, and the interviews were analyzed using content analysis. The interview results showed that participants who experienced the buildings in the immersive environment were able to make a more thorough and precise evaluation of the buildings than those who studied them through 2D materials. Moreover, analysis of the questionnaires revealed statistically significant differences in the measured constructs between the two groups. The outcome of this study suggests that integrating immersive environments into the profession and into architectural education as an effective and efficient tool for architecture criticism is vital, since these environments allow users to evaluate buildings holistically, supporting vigorous and sound criticism.
Keywords: immersive environments, architecture criticism, architectural education, occupant-centric evaluation, pre-occupancy evaluation
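The independent t-test used to compare the two groups can be sketched as follows. Welch's variant (which does not assume equal variances) is shown; the construct scores below are invented for illustration, since the study's raw data and SPSS settings are not given in the abstract.

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's independent-samples t statistic and its
    Welch-Satterthwaite degrees of freedom."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    t = (mean(a) - mean(b)) / (va + vb) ** 0.5
    df = (va + vb) ** 2 / (va ** 2 / (len(a) - 1) + vb ** 2 / (len(b) - 1))
    return t, df

# Hypothetical mean construct scores (1-5 scale), 11 students per group
immersive = [4.2, 4.5, 3.9, 4.8, 4.1, 4.6, 4.3, 4.7, 4.0, 4.4, 4.5]
drawings  = [3.1, 3.4, 2.9, 3.6, 3.2, 3.0, 3.5, 2.8, 3.3, 3.1, 3.4]

t, df = welch_t(immersive, drawings)
print(f"t = {t:.2f}, df = {df:.1f}")
```

The t statistic would then be compared against the t distribution with df degrees of freedom to obtain the p-value, which is what SPSS reports for the independent-samples test.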
Procedia PDF Downloads 135
907 Sustainable Technology and the Production of Housing
Authors: S. Arias
Abstract:
New housing developments, and the technological changes they imply, adapt residents' ways of living, as well as new family structures and forms of work, to the particular needs of a specific group of people; this involves different techniques for managing, organizing, equipping, and using a particular territory. Owning one's own space is increasingly important today, and cities face the challenge of meeting such demands, as well as providing the energy, water, and waste removal necessary in the process of constructing and occupying new human settlements. To date, these demands and needs have not been fully met, resulting in cities that grow without control, poorly used land, and congested avenues and streets. Buildings and dwellings have an important impact on the environment and on people's health; environmental quality therefore links human comfort to the sustainable development of natural resources. Applied to architecture, this concept involves incorporating new technologies throughout the construction process of a dwelling and changing the habits of developers and users, which demands a greater effort in planning for energy savings and thus reducing greenhouse gas (GHG) emissions, depending on the geographical location of the development. Since techniques of territorial occupation are not the same everywhere, it must be taken into account that they depend on the geographical, social, political, economic, and climatic-environmental circumstances of the place, which are modified according to the degree of development reached. In the analysis undertaken to check the degree of sustainability of the place, it is necessary to estimate the energy used in artificial air conditioning and lighting.
Likewise, it is necessary to diagnose the availability and distribution of the water resources used for hygiene and for cooling artificially air-conditioned spaces, as well as the waste resulting from these technological processes. Based on the results obtained through the different stages of the analysis, an energy audit can be performed in order to propose sustainability recommendations for architectural spaces, aiming at energy savings, rational use of water, and optimization of natural resources. This can be carried out through the development of a sustainable building code that provides technical recommendations suited to the regional characteristics of each study site. Such codes would lay the groundwork for building regulations applicable to new human settlements, generating quality, protection, and safety in them. These building regulations must be consistent with other national, state, and municipal regulations, such as laws on human settlements, urban development, and zoning.
Keywords: building regulations, housing, sustainability, technology
Procedia PDF Downloads 347
906 Limiting Freedom of Expression to Fight Radicalization: The 'Silencing' of Terrorists Does Not Always Allow Rights to 'Speak Loudly'
Authors: Arianna Vedaschi
Abstract:
This paper addresses the relationship between freedom of expression, national security, and radicalization. Is it still possible to talk about a balance between the first two elements? Or, owing to the intrusion of the third, is it more appropriate to consider freedom of expression 'permanently disfigured' by securitarian concerns? The study takes both the legislative and the judicial level into account and employs the comparative method in order to provide the reader with a complete framework of the relevant issues and a workable set of solutions. The analysis starts from the finding that the tension between free speech and national security has become a major issue in democratic countries, whose very essence is continuously endangered by the ever-changing and multi-faceted threat of international terrorism. In particular, a change in terrorist groups' recruiting patterns, attracting more and more people through a cutting-edge communicative strategy that often employs sophisticated technology as a radicalization tool, has called on law-makers to modify their approach to dangerous speech. While traditional constitutional and criminal law punished speech only if it explicitly and directly incited the commission of a criminal act (the 'cause-effect' model), so-called glorification offences, punishing mere ideological support for terrorism, often on the web, are becoming commonplace in the comparative scenario. Although this is a direct, and even somewhat understandable, consequence of the impending terrorist menace, this research shows many problematic issues connected to such a preventive approach. First, from a predominantly theoretical point of view, this trend negatively impacts the already blurred line between permissible and prohibited speech. Second, from a pragmatic point of view, such legislative tools are not always able to keep up with the ongoing developments of both terrorist groups and their use of technology.
In other words, there is a risk that such measures become outdated even before their application. Indeed, it seems hard to still speak of a proper balance: what was previously clearly perceived as a balancing of values (freedom of speech v. public security) has turned, in many cases, into a hierarchy with security at its apex. In light of these findings, this paper concludes that such a complex issue would perhaps be better dealt with through a combination of policies: not only criminalizing 'terrorist speech', which should be relegated to a last-resort tool, but also acting at an earlier stage, i.e., trying to prevent dangerous speech itself. This might be done by promoting social cohesion and the inclusion of minorities, so as to reduce the probability that people consider terrorist groups a 'viable option' for dealing with the lack of identification within their social contexts.
Keywords: radicalization, free speech, international terrorism, national security
Procedia PDF Downloads 199
905 External Noise Distillation in Quantum Holography with Undetected Light
Authors: Sebastian Töpfer, Jorge Fuenzalida, Marta Gilaberte Basset, Juan P. Torres, Markus Gräfe
Abstract:
This work presents an experimental and theoretical study of the noise resilience of quantum holography with undetected photons. Quantum imaging has become an important research topic since its first publication in 2014. Following this research, advances have been made towards different spectral ranges in detection and different optical geometries. Interest has developed especially in near-infrared to mid-infrared measurements, because of the unique characteristic that allows a sample to be probed with photons of a different wavelength than the photons arriving at the detector. This promising effect can be used for medical applications, measuring in the so-called molecular fingerprint region while using broadly available detectors for the visible spectral range. Further advances in quantum imaging methods have been made through new measurement and detection schemes. One of these is quantum holography with undetected light, which combines digital phase-shifting holography with quantum imaging to extend the obtainable sample information by measuring not only the object's transmission but also its influence on the phase shift experienced by the transmitted light. This work presents extended research on the quantum holography with undetected light scheme regarding the influence of external noise. It is shown experimentally and theoretically that the sample's information can still be retrieved at noise levels 250 times higher than the signal level, because the information is carried by the interferometric pattern. A detailed theoretical explanation is also provided.
Keywords: distillation, quantum holography, quantum imaging, quantum metrology
Procedia PDF Downloads 78
904 Next-Gen Solutions: How Generative AI Will Reshape Businesses
Authors: Aishwarya Rai
Abstract:
This study explores the transformative influence of generative AI on startups, businesses, and industries. We explore how large businesses can benefit in customer operations, where AI-powered chatbots can improve self-service and agent effectiveness, greatly increasing efficiency. In marketing and sales, generative AI could transform businesses by automating content development, data utilization, and personalization, resulting in a substantial increase in marketing and sales productivity. In software-engineering-focused startups, generative AI can streamline activities, significantly impacting coding processes and work experiences. It can be extremely useful in product R&D for market analysis, virtual design, simulations, and test preparation, altering old workflows and increasing efficiency. Zooming into the retail and CPG industry, industry findings suggest a 1-2% increase in annual revenues, equating to $400 billion to $660 billion. By automating customer service, marketing, sales, and supply chain management, generative AI can streamline operations and optimize personalized offerings, presenting itself as a disruptive force. While celebrating this economic potential, we acknowledge challenges such as external inference and adversarial attacks. Human involvement remains crucial for quality control and security in the era of generative-AI-driven transformative innovation. This talk provides a comprehensive exploration of generative AI's pivotal role in reshaping businesses, recognizing its strategic impact on customer interactions, productivity, and operational efficiency.
Keywords: generative AI, digital transformation, LLM, artificial intelligence, startups, businesses
Procedia PDF Downloads 78
903 Using Coupled Oscillators for Implementing Frequency Diverse Array
Authors: Maryam Hasheminasab, Ahmed Cheldavi, Ahmed Kishk
Abstract:
Frequency-diverse arrays (FDAs) have garnered significant attention from researchers due to their ability to combine frequency diversity with the inherent spatial diversity of an array. The introduction of frequency diversity in FDAs enables the generation of auto-scanning patterns that are range-dependent, which can have advantageous applications in communication and radar systems. However, the main challenge in implementing FDAs lies in determining the technique for distributing frequencies among the array elements. One approach to addressing this challenge is to utilize coupled oscillators, a technique commonly employed in active microwave theory. Nevertheless, the limited stability range of coupled oscillators poses another obstacle to effectively utilizing this technique. In this paper, we explore the possibility of employing a coupled oscillator array in the mode-locked state (MLS) for implementing frequency distribution in FDAs. Additionally, we propose and simulate the use of a digital phase-locked loop (DPLL) as a backup technique to stabilize the oscillators. Through simulations, we validate the functionality of this technique. This technique holds great promise for advancing the implementation of phased arrays and overcoming current scan rate and phase shifter limitations, especially at millimeter wave frequencies.
Keywords: angle-changing rate, auto scanning beam, pull-in range, hold-in range, locking range, mode locked state, frequency locked state
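The range-dependent auto-scanning pattern follows directly from assigning element n the frequency f0 + nΔf. A minimal sketch of the resulting array factor, with an assumed 16-element half-wavelength array and a 30 kHz increment (illustrative values, not the paper's design):

```python
import numpy as np

c = 3e8             # speed of light, m/s
f0 = 10e9           # carrier frequency (assumed 10 GHz for illustration)
df = 30e3           # per-element frequency increment (assumed 30 kHz)
N = 16              # number of elements
d = c / f0 / 2      # half-wavelength spacing

def fda_af(theta, R, t=0.0):
    """Normalized array-factor magnitude at angle theta (rad), range R (m), time t (s)."""
    n = np.arange(N)
    # element n radiates at f0 + n*df; its far-field phase depends on range AND angle
    phase = 2 * np.pi * ((f0 + n * df) * (t - R / c) + n * f0 * d * np.sin(theta) / c)
    return np.abs(np.exp(1j * phase).sum()) / N

angles = np.linspace(-np.pi / 2, np.pi / 2, 721)
peak_r1 = angles[np.argmax([fda_af(a, R=20.0e3) for a in angles])]   # beam angle at 20.0 km
peak_r2 = angles[np.argmax([fda_af(a, R=22.5e3) for a in angles])]   # beam angle at 22.5 km
# the main beam points in different directions at the two ranges (auto-scanning)
```

At a fixed instant the main beam points at different angles for different ranges, which is exactly the frequency distribution the coupled oscillator array, stabilized by the DPLL, must maintain across elements.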
Procedia PDF Downloads 87
902 Digital Technologies in Cultural Entrepreneurial Practice in Tech Arts in Morocco: Design or Fine Arts
Authors: Hiba Taim
Abstract:
This abstract falls within the scope of entrepreneurship, specifically cultural and creative entrepreneurship. It tackles the topic of "The Ecosystem in Cultural and Creative Entrepreneurship in North Africa". This piece of work deals with the absence of a clear structure for the ecosystem in the field of cultural and creative entrepreneurship in North Africa. The aim of this research is to create an integrated ecosystem that brings together all those involved in cultural and creative entrepreneurship in North Africa: training, financial support, continuing, international organizations, government banks, and means of communication. This study is significant not only because it suggests activities to develop this system but also because it provides information to cultural and creative entrepreneurs so that they can create project opportunities and activate the entrepreneurship process. It will also enable opportunities for them to work together and to formulate common cultural policies to develop the quality of cultural and creative services in North Africa. This research paper uses a qualitative approach to gather good-quality information about the problem being tackled, studying and analyzing different documents and conducting interviews with cultural entrepreneurs, which will help to collect all the information on the state of the ecosystem in North Africa. At present, this work is at the stage of collecting preliminary data and developing schedules for all phases of the research, so as to be productive and deliver the study in the coming months.
Keywords: cultural innovation, design innovation, design thinking, cultural entrepreneurship
Procedia PDF Downloads 147
901 Investigation of Fluid-Structure-Seabed Interaction of Gravity Anchor under Liquefaction and Scour
Authors: Vinay Kumar Vanjakula, Frank Adam, Nils Goseberg, Christian Windt
Abstract:
When a structure is installed on the seabed, its presence influences the flow field around it. The changes in the flow field include the formation of vortices, turbulence generation, the breaking of wave or current flow, and pressure differentials around the seabed sediment. These changes allow the local seabed sediment to be carried off and result in scour (erosion), which threatens the structure's stability. In recent decades, research and knowledge of scour on fixed structures (bridges and monopiles) in rivers and oceans has developed rapidly, but very limited research has been carried out on scour and liquefaction for gravity anchors, particularly for floating Tension Leg Platform (TLP) substructures. Because of the importance of, and the need to enhance, knowledge of scour and liquefaction around marine structures, MarTERA funded a three-year (2020-2023) research program called NuLIMAS (Numerical Modeling of Liquefaction Around Marine Structures), carried out by a group of European institutions (universities, laboratories, and consulting companies). The objective of this study is to build a numerical model that replicates reality, which helps to simulate (predict) underwater flow conditions and to study different marine scour and liquefaction situations. It supports the design of a heavyweight anchor for the TLP substructure and minimizes the time and expenditure spent on experiments. The achieved results and the numerical model will also be a basis for the development of other designs and concepts for marine structures. The Computational Fluid Dynamics (CFD) numerical model will be built in OpenFOAM. A conceptual design of a heavyweight anchor for the TLP substructure is developed, taking into consideration the available state-of-the-art knowledge on scour and liquefaction and with reference to previous designs. These conceptual designs are validated against similar experimental benchmark data and against CFD numerical benchmark standards (a CFD quality assurance study). A CFD optimization model/tool is designed to minimize the effects of fluid flow, scour, and liquefaction. A parameterized model is also developed to automate the calculation process and reduce user interaction. Parameters such as the anchor lowering process, flow-optimized outer contours, seabed interaction, and FSSI (fluid-structure-seabed interactions) are investigated and used to refine the model toward an optimized anchor.
Keywords: gravity anchor, liquefaction, scour, computational fluid dynamics
Procedia PDF Downloads 144
900 Ubiquitous Learning Environments in Higher Education: A Scoping Literature Review
Authors: Mari A. Virtanen, Elina Haavisto, Eeva Liikanen, Maria Kääriäinen
Abstract:
Ubiquitous learning and the use of ubiquitous learning environments herald a new era in higher education. Ubiquitous learning environments fuse together authentic learning situations and digital learning spaces where students can seamlessly immerse themselves in the learning process. Definitions of ubiquitous learning in the previous literature are wide and varied, and learning environments are not systematically described. The aim of this scoping review was to identify the criteria and the use of ubiquitous learning environments in higher education contexts. The objective was to provide a clear scope and a wide view of this research area. The original studies were collected from nine electronic databases. Seven publications in total were deemed eligible and included in the final review. Inductive content analysis was used for the data analysis. The reviewed publications described the use of ubiquitous learning environments (ULE) in higher education. Components, contents, and outcomes varied between studies, but there were also many similarities. In these studies, the concept of ubiquitousness was defined through context-awareness, embeddedness, content personalization, location awareness, interactivity, and flexibility, supported by the use of smart devices, wireless networks, and sensing technologies. Contents varied between studies and were customized for specific uses. Measured outcomes focused on multiple aspects, such as learning effectiveness, cost-effectiveness, satisfaction, and usefulness. This study provides a clear scope for ULE use in higher education. It also raises the need for transparent development and publication processes, and for practical implications of ubiquitous learning environments.
Keywords: higher education, learning environment, scoping review, ubiquitous learning, u-learning
Procedia PDF Downloads 266
899 The Effects of Computer Game-Based Pedagogy on Graduate Students Statistics Performance
Authors: Clement Yeboah, Eva Laryea
Abstract:
A pretest-posttest within-subjects experimental design was employed to examine the effects of a computerized basic statistics learning game on the achievement and statistics-related anxiety of students enrolled in an introductory graduate statistics course. Participants (N = 34) were graduate students in a variety of programs at a state-funded research university in the Southeast United States. We analyzed pretest-posttest differences using paired-samples t-tests for achievement and for statistics anxiety. The results of the t-test for knowledge of statistics were statistically significant, indicating significant mean gains in statistical knowledge as a function of the game-based intervention. Likewise, the results of the t-test for statistics-related anxiety were also statistically significant, indicating a decrease in anxiety from pretest to posttest. The implications of the present study are significant for both teachers and students. For teachers, using computer games developed by the researchers can help to create a more dynamic and engaging classroom environment, as well as improve student learning outcomes. For students, playing these educational games can help to develop important skills such as problem solving, critical thinking, and collaboration. Students can develop an interest in the subject matter and spend quality time learning the course as they play the game, without noticing that they are studying material often perceived as hard. The future directions of the present study are promising as technology continues to advance and become more widely available. Some potential future developments include the integration of virtual and augmented reality into educational games, the use of machine learning and artificial intelligence to create personalized learning experiences, and the development of new and innovative game-based assessment tools. It is also important to consider the ethical implications of computer game-based pedagogy, such as the potential for games to perpetuate harmful stereotypes and biases. As the field continues to evolve, it will be crucial to address these issues and work towards creating inclusive and equitable learning experiences for all students. This study has the potential to revolutionize the way graduate students learn basic statistics and offers exciting opportunities for future development and research. It is an important area of inquiry for educators, researchers, and policymakers and will continue to be a dynamic and rapidly evolving field for years to come.
Keywords: pretest-posttest within subjects, computer game-based learning, statistics achievement, statistics anxiety
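The paired-samples analysis described above reduces to a t statistic on per-student gain scores. A minimal sketch with hypothetical scores (not the study's N = 34 data):

```python
import numpy as np

def paired_t(pre, post):
    """Paired-samples t statistic and degrees of freedom for posttest - pretest gains."""
    d = np.asarray(post, float) - np.asarray(pre, float)
    n = d.size
    t = d.mean() / (d.std(ddof=1) / np.sqrt(n))   # mean gain / standard error of gains
    return t, n - 1

# Hypothetical scores for 8 students (illustration only, not the study's data)
pre = [12, 15, 11, 14, 13, 10, 16, 12]
post = [15, 17, 14, 16, 15, 13, 18, 15]
t, df = paired_t(pre, post)
```

With |t| well above the critical value for df = n - 1, the null hypothesis of no mean gain would be rejected; a two-sided p-value can be obtained from `scipy.stats.t.sf(abs(t), df) * 2`.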
Procedia PDF Downloads 78
898 Informed Urban Design: Minimizing Urban Heat Island Intensity via Stochastic Optimization
Authors: Luis Guilherme Resende Santos, Ido Nevat, Leslie Norford
Abstract:
The Urban Heat Island (UHI) is characterized by increased air temperatures in urban areas compared to undeveloped rural surrounding environments. With urbanization and densification, the intensity of UHI increases, bringing negative impacts on livability, health, and the economy. In order to reduce those effects, design factors must be taken into consideration when planning future developments. Given design constraints such as population size and availability of area for development, non-trivial decisions regarding the buildings' dimensions and their spatial distribution are required. We develop a framework for the optimization of urban design in order to jointly minimize UHI intensity and buildings' energy consumption. First, the design constraints are defined according to spatial and population limits in order to establish realistic boundaries that would be applicable in real-life decisions. Second, the tools Urban Weather Generator (UWG) and EnergyPlus are used to generate outputs of UHI intensity and total buildings' energy consumption, respectively. Those outputs change based on a set of variable inputs related to urban morphology, such as building height, urban canyon width, and population density. Lastly, an optimization problem is cast in which a utility function quantifies the performance of each design candidate (e.g., minimizing a linear combination of UHI intensity and energy consumption) and a set of constraints to be met is defined. Solving this optimization problem is difficult, since there is no simple analytic form representing the UWG and EnergyPlus models. We therefore cannot use direct optimization techniques but instead develop an indirect "black box" optimization algorithm. To this end, we develop a solution based on a stochastic optimization method known as the Cross Entropy Method (CEM). The CEM translates the deterministic optimization problem into an associated stochastic optimization problem that is simple to solve analytically. We illustrate our model on a typical residential area in Singapore. Due to fast growth in population and built area, and the land availability generated by land reclamation, urban planning decisions are of the utmost importance for the country. Furthermore, the hot and humid climate raises concern about the impact of UHI. The problem presented is highly relevant to early urban design stages, and the objective of such a framework is to guide decision makers and assist them in including and evaluating urban microclimate and energy aspects in the process of urban planning.
Keywords: building energy consumption, stochastic optimization, urban design, urban heat island, urban weather generator
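Because UWG and EnergyPlus enter only as black-box evaluators, the CEM loop just samples candidate designs, scores them, and refits its sampling distribution to the elite subset. A toy sketch in which a quadratic surrogate stands in for the real UHI-plus-energy objective (all numbers are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

def objective(x):
    """Black-box stand-in for a weighted UHI + energy score of a design
    x = (building height, canyon width); the real evaluator would call
    UWG and EnergyPlus. Toy optimum placed at (25, 12)."""
    h, w = x
    return (h - 25.0) ** 2 + 2.0 * (w - 12.0) ** 2

def cem_minimize(f, mu, sigma, n_samples=200, n_elite=20, iters=50):
    """Cross Entropy Method: sample, keep the elites, refit the Gaussian sampler."""
    mu, sigma = np.array(mu, float), np.array(sigma, float)
    for _ in range(iters):
        x = rng.normal(mu, sigma, size=(n_samples, mu.size))   # candidate designs
        elite = x[np.argsort([f(xi) for xi in x])[:n_elite]]   # best-scoring designs
        mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-6  # refit sampler
    return mu

best = cem_minimize(objective, mu=[40.0, 30.0], sigma=[15.0, 15.0])
```

In the real framework, `objective(x)` would invoke the UWG and EnergyPlus simulations, and the design vector would carry the morphology variables (building height, canyon width, density) subject to the stated constraints.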
Procedia PDF Downloads 134
897 The Internet of Things: A Survey of Authentication Mechanisms and Protocols for the Shifting Paradigm of Communicating Entities
Authors: Nazli Hardy
Abstract:
A multidisciplinary application of computer science and interactive database-driven web applications, the Internet of Things (IoT) represents a digital ecosystem with pervasive technological, social, and economic impact on the human population. It is a long-term technology, and its development is built around the connection of everyday objects to the Internet. It is estimated that by 2020, with billions of people connected to the Internet, the number of connected devices will exceed 50 billion; IoT thus represents a paradigm shift in our current interconnected ecosystem, a communication shift that will unavoidably affect people, businesses, consumers, clients, and employees. By nature, in order to provide a cohesive and integrated service, connected devices need to collect, aggregate, store, mine, and process personal and personalized data on individuals and corporations in a variety of contexts and environments. A significant factor in this paradigm shift is the necessity for secure and appropriate transmission, processing, and storage of the data. Thus, while the benefits of the applications appear to be boundless, these same opportunities are bounded by concerns such as trust, privacy, security, loss of control, and related issues. This poster and presentation look at multi-factor authentication (MFA) mechanisms that need to change from the login-password tuple to an Identity and Access Management (IAM) model, and on to the more cohesive Identity Relationship Management (IRM) standard. It also compares and contrasts messaging protocols that are appropriate for the IoT ecosystem.
Keywords: Internet of Things (IoT), authentication, protocols, survey
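As one concrete building block of MFA beyond the login-password tuple, a time-based one-time password (RFC 6238) can be generated with the standard library alone; the sketch below uses the RFC's SHA-1, 8-digit test configuration:

```python
import hmac
import hashlib
import struct

def totp(secret: bytes, for_time: float, step: int = 30, digits: int = 8,
         digestmod=hashlib.sha1) -> str:
    """RFC 6238 time-based one-time password (HOTP applied to a time counter)."""
    counter = int(for_time) // step                     # 30-s time steps since epoch
    msg = struct.pack(">Q", counter)                    # 8-byte big-endian counter
    mac = hmac.new(secret, msg, digestmod).digest()
    offset = mac[-1] & 0x0F                             # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: SHA-1 key "1234567890...", 8 digits, T = 59 s -> "94287082"
otp = totp(b"12345678901234567890", 59)
```

In an IoT setting such a second factor would complement, not replace, device identity management under an IAM/IRM model.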
Procedia PDF Downloads 300
896 Causality between Stock Indices and Cryptocurrencies during the Russia-Ukraine War
Authors: Nidhal Mgadmi, Abdelhafidh Othmani
Abstract:
This article examines the causal relationship between stock indices and cryptocurrencies during the current war between Russia and Ukraine. The econometric investigation runs from February 24, 2022, to April 12, 2023, focusing on seven stock market indices (S&P 500, DAX, CAC 40, Nikkei, TSX, MOEX, and PFTS) and seven cryptocurrencies (Bitcoin, Ethereum, Litecoin, Dash, Ripple, DigiByte, and XEM). In this article, we try to understand how investors react to fluctuations in financial assets and seek safe havens in cryptocurrencies. We used dynamic causality to detect a possible short-term causal relationship and seven models to estimate the long-term relationship between cryptocurrencies and financial assets. The short-run causal relationship between financial market indexes and cryptocurrencies indicates that three famous cryptocurrencies (Bitcoin, Ethereum, Ripple) and two digital assets with minor popularity (XEM, DigiByte) are impacted by the German, Russian, and Ukrainian stock markets. In the long run, we found a positive and significant effect of the American, Canadian, French, and Ukrainian stock market indexes on Bitcoin. Thus, the stability of the traditional financial markets during the current war period can be explained, on the one hand, by investors' fears of an unstable business climate and, on the other hand, by speculators' sentiment towards new electronic products, which are perceived as hedging instruments and a safe haven in the face of the conflict between Ukraine and Russia.
Keywords: causality, stock indices, cryptocurrency, war, Russia, Ukraine
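Short-run causality of this kind is commonly operationalized as a Granger-type F-test: do lagged values of one series improve the prediction of another beyond its own lags? A self-contained sketch on synthetic return series (an illustration of the idea, not the article's estimator or data):

```python
import numpy as np

rng = np.random.default_rng(2)

def granger_f(y, x, p=2):
    """F statistic for 'x Granger-causes y' with p lags (restricted vs. full OLS)."""
    y, x = np.asarray(y, float), np.asarray(x, float)
    T = len(y)
    lag = lambda s, k: s[p - k:T - k]          # s_{t-k} aligned for t = p..T-1
    Y = y[p:]
    Xr = np.column_stack([np.ones(T - p)] + [lag(y, k) for k in range(1, p + 1)])
    Xu = np.column_stack([Xr] + [lag(x, k) for k in range(1, p + 1)])
    rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    num = (rss(Xr) - rss(Xu)) / p                      # gain from adding x lags
    den = rss(Xu) / (T - p - Xu.shape[1])              # residual variance, full model
    return num / den

# Synthetic daily returns where x leads y by one day (illustration, not market data)
T = 300
x = rng.standard_normal(T)
y = np.empty(T)
y[0] = 0.0
y[1:] = 0.6 * x[:-1] + 0.5 * rng.standard_normal(T - 1)

f_xy = granger_f(y, x)   # large: lags of x help predict y
f_yx = granger_f(x, y)   # small: lags of y do not help predict x
```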
Procedia PDF Downloads 68
895 'I'm in a Very Safe Place': Webcam Sex Workers in Aotearoa, New Zealand and Their Perceptions of Danger and Risk
Authors: Madeline V. Henry
Abstract:
Sex work is a contested subject in academia. Many authors now argue that the practice should be recognized as a legitimate and rationally chosen form of labor, and that decriminalization is necessary to ensure the safety of sex workers and reduce their stigmatization. However, a prevailing argument remains that the work is inherently violent and oppressive and that all sex workers are directly or indirectly coerced into participating in the industry. This argument has been complicated by the recent proliferation of computer-mediated technologies that allow people to conduct sex work without the need to be physically co-present with customers or pimps. One example of this is the practice of 'camming', wherein 'webcam models' stream themselves stripping and/or performing autoerotic stimulation in an online chat-room for payment. In this presentation, interviews with eight 'camgirls' (aged 22-34) will be discussed. Their talk has been analyzed using Foucauldian discourse analysis, focusing on common discursive threads in relation to the work and their subjectivities. It was found that the participants demonstrated appreciation for the lack of physical danger they were in, but emphasized the unique and significant dangers of online-based sex work (their images and videos being recorded and shared without their consent, for example). Participants also argued that their largest concerns were based around stigma, which they claimed remained prevalent despite the decriminalized legal model in Aotearoa/New Zealand (which has been in place for over 14 years). Overall, this project seeks to challenge commonplace academic approaches to sex work, adding further research to support sex workers' rights and highlighting new issues to consider in a digital environment.
Keywords: camming, sex work, stigma, risk
Procedia PDF Downloads 156
894 Monte Carlo Simulation of X-Ray Spectra in Diagnostic Radiology and Mammography Using MCNP4C
Authors: Sahar Heidary, Ramin Ghasemi Shayan
Abstract:
The Monte Carlo N-Particle transport code (MCNP4C) was used to generate x-ray spectra for diagnostic radiology and mammography. The electrons were transported until they slowed down and stopped in the target, and both bremsstrahlung and characteristic x-ray production were modeled. The x-ray spectra predicted by several computational models used in the diagnostic radiology and mammography energy ranges were evaluated by comparison with measured spectra, and their effect on the calculation of absorbed dose and effective dose (ED) to the adult ORNL hermaphroditic phantom was quantified. The models comprise empirical models (TASMIP and MASMIP), semi-empirical models (X-rayb&m, X-raytbc, XCOMP, IPEM, Tucker et al., and Blough et al.), and Monte Carlo codes (EGS4, ITS3.0, and MCNP4C). Images acquired using synchrotron radiation (SR) with both a screen-film and a CR system were compared with images of the same samples obtained with digital mammography equipment. In view of the good quality of the results obtained, the CR system was used in two mammographic examinations with SR. For each mammography unit, the protocol yielded bilateral mediolateral oblique (MLO) and craniocaudal (CC) mammograms obtained in a woman with fatty breasts and a woman with dense breasts. Reviewers assessed the common findings and definite absences that led to the decisions underlying the clinical images.
Keywords: mammography, Monte Carlo, effective dose, radiology
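At the core of any MCNP-style photon calculation is inverse-CDF sampling of the exponential interaction law. A minimal sketch that checks sampled photon free paths against Beer-Lambert attenuation (the attenuation coefficient and slab thickness are illustrative, not the study's beam qualities):

```python
import numpy as np

rng = np.random.default_rng(3)

mu = 0.5        # linear attenuation coefficient, 1/cm (assumed mono-energetic beam)
L = 2.0         # slab thickness, cm
n = 200_000     # photon histories

# Distance to the next interaction: s = -ln(U)/mu, with U uniform on (0, 1]
# (inverse-CDF sampling of the exponential law underlying transport codes)
s = -np.log(1.0 - rng.random(n)) / mu

transmitted = np.mean(s > L)    # fraction crossing the slab without interacting
analytic = np.exp(-mu * L)      # Beer-Lambert prediction
```

A full spectrum calculation layers electron slowing-down, bremsstrahlung, and characteristic emission on top of this primitive, but the free-path sampling above is the step every history repeats.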
Procedia PDF Downloads 131
893 Accuracy of Autonomy Navigation of Unmanned Aircraft Systems through Imagery
Authors: Sidney A. Lima, Hermann J. H. Kux, Elcio H. Shiguemori
Abstract:
Unmanned Aircraft Systems (UAS) usually navigate using the Global Navigation Satellite System (GNSS) combined with an Inertial Navigation System (INS). However, GNSS accuracy can be degraded at any time, or the GNSS signal can be lost entirely. In addition, there is the possibility of malicious interference, known as jamming. An image-based navigation system can therefore solve the autonomy problem: if GNSS is disabled or degraded, the image navigation system continues to provide coordinate information to the INS, preserving the autonomy of the system. This work aims to evaluate the accuracy of positioning through photogrammetry concepts. The methodology uses orthophotos and Digital Surface Models (DSM) as a reference to represent the object space, and photographs obtained during the flight to represent the image space. For the calculation of the coordinates of the perspective center and the camera attitudes, it is necessary to know the coordinates of homologous points in the object space (orthophoto coordinates and DSM altitude) and in the image space (column and line of the photograph). If the homologous points can be identified automatically in real time, the coordinates and attitudes can be calculated with their respective accuracies. With the methodology applied in this work, maximum errors on the order of 0.5 m in positioning and 0.6º in camera attitude were verified, so navigation through imagery can reach accuracy equal to or better than GNSS receivers without differential correction. Therefore, navigating through imagery is a good alternative for enabling autonomous navigation.
Keywords: autonomy, navigation, security, photogrammetry, remote sensing, spatial resection, UAS
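The resection step, recovering the camera from homologous points, can be illustrated with the linear DLT formulation, a simplified stand-in for the iterative collinearity-based spatial resection used in photogrammetry; the camera matrix and control points below are synthetic:

```python
import numpy as np

def dlt_resection(obj_pts, img_pts):
    """Linear estimate of the 3x4 camera projection matrix from >= 6
    object-space / image-space correspondences (Direct Linear Transform)."""
    rows = []
    for (X, Y, Z), (u, v) in zip(obj_pts, img_pts):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    _, _, vt = np.linalg.svd(np.asarray(rows, float))
    return vt[-1].reshape(3, 4)            # null-space solution, defined up to scale

def project(P, pt):
    x = P @ np.append(np.asarray(pt, float), 1.0)
    return x[:2] / x[2]

# Synthetic camera and non-coplanar control points (all values illustrative)
P_true = np.array([[800.0, 0.0, 320.0, 80.0],
                   [0.0, 800.0, 240.0, -160.0],
                   [0.0, 0.0, 1.0, 5.0]])
obj = [(0, 0, 5), (1, 0, 6), (0, 1, 7), (1, 1, 8),
       (2, 1, 6), (1, 2, 7), (2, 2, 9), (0, 2, 5)]
img = [project(P_true, p) for p in obj]

P_est = dlt_resection(obj, img)
err = max(np.linalg.norm(project(P_est, p) - uv) for p, uv in zip(obj, img))
```

In practice the DLT solution would seed a rigorous resection or bundle adjustment, and the image coordinates would come from automatically matched homologous points between the flight photograph and the orthophoto/DSM reference.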
Procedia PDF Downloads 192
892 Combat Capability Improvement Using Sleep Analysis
Authors: Gabriela Kloudova, Miloslav Stehlik, Peter Sos
Abstract:
The quality of sleep can affect combat performance, where vigilance, accuracy, and reaction time are decisive factors. In the present study, airborne and special units are measured on duty using an actigraphy fingerprint scoring algorithm and QEEG (quantitative EEG). The actigraphic variables of interest are: mean nightly sleep duration, mean napping duration, mean 24-h sleep duration, mean sleep latency, mean sleep maintenance efficiency, mean sleep fragmentation index, mean sleep onset time, mean sleep offset time, and mean midpoint time. To determine the individual somnotype of each subject, data such as sleep pattern, chronotype (morning and evening lateness), biological need for sleep (daytime and anytime sleepability), and trototype (daytime and anytime wakeability) are extracted. Subsequently, a series of recommendations is included in the training plan based on daily routine, the timing of day and night activities, the duration of sleep, and the number of sleep blocks in a defined time. The aim of these modifications to the training plan is to reduce daytime sleepiness; improve vigilance, attention, accuracy, and the speed of the conducted tasks; and optimize energy supplies. Regular improvement of the training is expected to have long-term neurobiological consequences, including changes in neuronal activity measured by QEEG. That, in turn, should enhance cognitive functioning, as assessed by digital cognitive test batteries, and improve the subjects' overall performance.
Keywords: sleep quality, combat performance, actigraph, somnotype
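The actigraphic variables listed above can be computed directly from scored sleep/wake epochs. A simplified sketch (definitions of efficiency and fragmentation vary between scoring algorithms, so the forms below are common but assumed):

```python
import numpy as np

def sleep_metrics(epochs):
    """Summary metrics from 1-min actigraphy epochs (1 = sleep, 0 = wake)."""
    e = np.asarray(epochs)
    sleep_idx = np.flatnonzero(e == 1)
    onset, offset = sleep_idx[0], sleep_idx[-1]        # first and last sleep epoch
    in_bed = e[onset:offset + 1]                       # sleep period time window
    tst = int(in_bed.sum())                            # total sleep time, minutes
    sme = tst / in_bed.size                            # sleep maintenance efficiency
    awakenings = int(np.sum(np.diff(in_bed) == -1))    # sleep -> wake transitions
    frag = awakenings / (tst / 60)                     # awakenings per hour of sleep
    return {"onset": int(onset), "tst_min": tst,
            "efficiency": sme, "awakenings_per_h": frag}

# 8-h record: 30-min latency, two 10-min awakenings, final wake (illustrative)
rec = [0] * 30 + [1] * 120 + [0] * 10 + [1] * 150 + [0] * 10 + [1] * 130 + [0] * 30
m = sleep_metrics(rec)
```

Aggregating such per-night metrics across duty periods yields the mean values (latency, efficiency, fragmentation index, onset/offset/midpoint times) on which the somnotype assessment is based.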
Procedia PDF Downloads 171
891 Spatial-Temporal Clustering Characteristics of Dengue in the Northern Region of Sri Lanka, 2010-2013
Authors: Sumiko Anno, Keiji Imaoka, Takeo Tadono, Tamotsu Igarashi, Subramaniam Sivaganesh, Selvam Kannathasan, Vaithehi Kumaran, Sinnathamby Noble Surendran
Abstract:
Dengue outbreaks are affected by biological, ecological, socio-economic, and demographic factors that vary over time and space. These factors have been examined separately and still require systematic clarification. The present study aimed to investigate the spatial-temporal clustering relationships between these factors and dengue outbreaks in the northern region of Sri Lanka. Remote sensing (RS) data gathered from several satellites were used to develop an index comprising rainfall, humidity, and temperature data. RS data gathered by ALOS/AVNIR-2 were used to detect urbanization, and a digital land cover map was used to extract land cover information. Other data on relevant factors and dengue outbreaks were collected from institutions and existing databases. The analyzed RS data and databases were integrated into geographic information systems, enabling temporal analysis, spatial statistical analysis, and space-time clustering analysis. Our results showed that increases in the number of ecological, socio-economic, and demographic factors that are above average, or present, contribute to significantly high rates of space-time dengue clusters.
Keywords: ALOS/AVNIR-2, dengue, space-time clustering analysis, Sri Lanka
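A classic way to test the space-time clustering analyzed here is the Knox test: count case pairs close in both space and time and compare against a time-permutation null. A sketch on synthetic outbreak data (thresholds, cluster positions, and sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)

def knox_test(xy, t, ds, dt, n_perm=999):
    """Knox space-time interaction test: pairs close in BOTH space (< ds) and
    time (< dt), with significance from permuting case times."""
    xy, t = np.asarray(xy, float), np.asarray(t, float)
    d_space = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    iu = np.triu_indices(len(t), k=1)          # each unordered pair once
    close_s = d_space[iu] < ds

    def stat(times):
        close_t = np.abs(times[:, None] - times[None, :])[iu] < dt
        return int(np.sum(close_s & close_t))

    obs = stat(t)
    null = [stat(rng.permutation(t)) for _ in range(n_perm)]
    p = (1 + sum(s >= obs for s in null)) / (n_perm + 1)
    return obs, p

# Synthetic outbreak: background noise plus two space-time clusters (illustrative)
n_bg = 40
xy = np.vstack([rng.uniform(0, 10, (n_bg, 2)),
                rng.normal([2, 2], 0.2, (15, 2)),
                rng.normal([8, 7], 0.2, (15, 2))])
t = np.concatenate([rng.uniform(0, 100, n_bg),
                    rng.normal(20, 2, 15),
                    rng.normal(70, 2, 15)])
obs, p = knox_test(xy, t, ds=1.0, dt=7.0)
```

Scan-statistic software used in practice generalizes this idea by searching over cluster locations and window sizes rather than fixing single thresholds.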
Procedia PDF Downloads 479
890 Litigating Innocence in the Era of Forensic Law: The Problem of Wrongful Convictions in the Absence of Effective Post-Conviction Remedies in South Africa
Authors: Tapiwa Shumba
Abstract:
The right to fairness and access to appeals and reviews enshrined under the South African Constitution seeks to ensure that justice is served. In essence, the constitution and the law have put in place mechanisms to ensure that a miscarriage of justice through wrongful convictions does not occur. However, once convicted and sentenced on appeal, the procedural safeguards seem to resign, as if to say the accused has met his fate. The challenge with this construction is that even within an ideally perfect legal system, wrongful convictions would still occur. Therefore, it is not so much the failings of a legal system that demand attention but mechanisms to redress the results of such failings where evidence becomes available that a wrongful conviction occurred. In this context, this paper looks at the South African criminal procedural mechanisms for litigating innocence post-conviction. The discussion focuses on the role of section 327 of the South African Criminal Procedure Act and its apparent shortcomings in providing an avenue for victims of miscarriages to litigate their innocence by adducing new evidence at any stage during their wrongful incarceration. By looking at developments in other jurisdictions, such as the United Kingdom, from which South African criminal procedure draws much of its history, and the North Carolina example, which was itself inspired by the UK Criminal Cases Review Commission, this paper is able to make comparisons and draw invaluable lessons for the South African criminal justice system. Lessons from these foreign jurisdictions show that South African post-conviction criminal procedures need reform in line with the constitutional values of human dignity, equality before the law, openness, and transparency. The paper proposes an independent review of the current processes to assess the current post-conviction procedures under section 327. The review must look into the effectiveness of the current system and how it can be improved in line with new substantive legal provisions creating access to DNA evidence for post-conviction exonerations. Although the UK CCRC body should not be slavishly followed, its operations and the process leading to its establishment certainly provide a good point of reference and invaluable lessons for the South African criminal justice system, seeing that South African law on this aspect has generally followed the English approach, except that the current provisions under section 327 mirror the discredited system of the UK's previous dispensation. A new independent mechanism that treats innocent victims of the criminal justice system with dignity, away from the current political process, is proposed to enable the South African criminal justice system to benefit fully from recent and upcoming advances in science and technology.
Keywords: innocence, forensic law, post-conviction remedies, South African criminal justice system, wrongful conviction
Procedia PDF Downloads 236
889 The Review of Permanent Downhole Monitoring System
Abstract:
With the increasingly difficult development and operating environment of exploration, there are many new challenges in developing and exploiting oil and gas resources. These include the ability to dynamically monitor wells and to provide data and assurance for the completion and production of high-cost and complex wells. A key technology for providing these assurances and maximizing oilfield profitability is real-time permanent reservoir monitoring. Optical fiber sensing systems have gradually begun to replace traditional electronic systems. Traditional temperature sensors can only achieve single-point temperature monitoring, but fiber optic sensing systems based on the Bragg grating principle offer a high level of reliability, accuracy, stability, and resolution, enabling cost-effective monitoring that can be performed in real time, at any time, and without well intervention. Continuous data acquisition is performed along the entire wellbore. An integrated package with a downhole pressure gauge, packer, and surface system can also realize real-time dynamic monitoring of the pressure in selected downhole sections, avoiding well intervention and eliminating the production delays and operational risks of conventional surveys. The real-time information obtained through permanent optical fibers can also provide critical reservoir monitoring data for production and recovery optimization.
Keywords: PDHM, optical fiber, coiled tubing, photoelectric composite cable, digital-oilfield
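The fiber Bragg grating readout rests on the Bragg condition λB = 2·n_eff·Λ and on the near-linear shift of λB with temperature. A back-of-the-envelope sketch using typical textbook constants for silica fiber (assumed values, not this system's calibration):

```python
# Fiber Bragg grating basics: reflected (Bragg) wavelength and its thermal shift.
# All material constants below are typical silica-fiber textbook values (assumptions).
n_eff = 1.468          # effective refractive index of the fiber core
period = 528e-9        # grating period Lambda, m

lambda_b = 2 * n_eff * period           # Bragg condition: lambda_B = 2 * n_eff * Lambda
# ~1.55 um, the usual telecom/sensing window

alpha = 0.55e-6        # thermal expansion coefficient of silica, 1/degC
xi = 6.7e-6            # thermo-optic coefficient, 1/degC
dT = 50.0              # downhole temperature change, degC

shift = lambda_b * (alpha + xi) * dT    # wavelength shift read out at the surface
```

At roughly 11 pm/°C near 1550 nm, a 50 °C downhole change shifts the reflected wavelength by about 0.56 nm, comfortably within the resolution of surface interrogation units, which is what makes distributed downhole temperature profiles practical.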
Procedia PDF Downloads 79
888 Green Organic Chemistry, a New Paradigm in Pharmaceutical Sciences
Authors: Pesaru Vigneshwar Reddy, Parvathaneni Pavan
Abstract:
Green organic chemistry, one of the most researched topics today, has been in demand since the 1990s. Organic chemicals are important starting materials for a great number of major chemical industries. The production of organic chemicals as raw materials or reagents for other applications is a major manufacturing sector: polymers, pharmaceuticals, pesticides, paints, artificial fibers, food additives, etc. Organic synthesis on a large scale, compared to the laboratory scale, involves the use of energy, basic chemical ingredients from the petrochemical sector, and catalysts, and, after the end of the reaction, separation, purification, storage, packing, and distribution. During these processes there are many health and safety problems for workers, in addition to the environmental problems caused by the use of chemicals and their disposal as waste. Green chemistry, with its 12 principles, would like to see changes in the conventional ways that were used for decades to make synthetic organic chemicals, including the use of less toxic starting materials. Green chemistry would like to increase the efficiency of synthetic methods, use less toxic solvents, reduce the number of stages in synthetic routes, and minimize waste as far as practically possible. In this way, organic synthesis will be part of the effort for sustainable development. Green chemistry is also interested in research and alternative innovations on many practical aspects of organic synthesis in university and institutional research laboratories. By changing the methodologies of organic synthesis, health and safety will be advanced at the small-scale laboratory level and will also be extended to large-scale industrial production through new techniques. Three key developments in green chemistry include the use of supercritical carbon dioxide as a green solvent, aqueous hydrogen peroxide as an oxidizing agent, and hydrogen in asymmetric synthesis. The field also focuses on replacing traditional methods of heating with modern methods such as microwave irradiation, so that the carbon footprint is reduced as far as possible. Another benefit of green chemistry is reduced environmental pollution through the use of less toxic reagents, the minimization of waste, and more biodegradable byproducts. In this paper, some of the basic principles, approaches, and early achievements of green chemistry as a branch of chemistry are considered, together with a summary of the green chemistry principles. A discussion of the E-factor, the old and new syntheses of ibuprofen, microwave techniques, and some recent advancements is also included.
Keywords: energy, e-factor, carbon foot print, micro-wave, sono-chemistry, advancement
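The E-factor mentioned above is simple arithmetic, kilograms of waste generated per kilogram of product; the classic contrast between the six-step Boots ibuprofen route and the three-step BHC route is usually told in exactly these terms. A sketch with hypothetical batch numbers:

```python
def e_factor(total_input_kg, product_kg):
    """Sheldon E-factor: kg of waste generated per kg of product isolated."""
    return (total_input_kg - product_kg) / product_kg

# Hypothetical batch (illustration, not measured plant data):
# 1200 kg of raw materials, reagents, and solvents yield 200 kg of product.
ef = e_factor(1200.0, 200.0)   # -> 5.0 kg waste per kg product
```

Lowering the E-factor, by cutting synthetic steps, recycling solvents, or switching to catalytic reagents, is one of the most direct quantitative expressions of the 12 principles.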
Procedia PDF Downloads 307