Everyone, I hope you're enjoying your lunch. On behalf of the research council, I'm very pleased to welcome you to our inaugural research colloquium. I'd like to start my remarks by thanking President Gloria Larson and the Bentley board of trustees for their ongoing and tremendous support for research at Bentley. [APPLAUSE] This colloquium is brought to you by the research council at the request of Mike Page. Our goals are to stimulate new research at Bentley at the same time that we're celebrating research and also supporting research. There are so many people here that need to be thanked. I won't have time to list everyone, but I do want to mention the research council, the colloquium committee, which includes some people outside of the council, obviously our keynote speakers and our keynote listeners, and the many individual faculty members and doctoral students whose work will be represented here today. And a veritable army of people who helped provide support behind the scenes over a period of six months for this event. I'd also like to thank the audience, because without you, the rest of us would be talking to ourselves, and that really wouldn't be very much fun at all. So we're really happy to have you here in the audience. There are two individuals that really deserve to be singled out for special thanks, as far as this colloquium goes. The first is Bob Galliers, our university distinguished professor and former provost, who's done so much to put research at Bentley on the map, both internally as a priority, and also externally by letting the world know about the excellence and quality of Bentley research. The second, no surprise, is Mike Page, our current provost, who has established the research council and given us the charge of advancing the teacher-scholar model at Bentley. This colloquium was his idea and, without his leadership, this would never have occurred. Our program today was timed to occur in conjunction with the Bentley board of trustees meeting, and there were two reasons for that. First, the trustees are a key stakeholder group for research at Bentley. It's very important for us to take this opportunity to showcase what we're doing and ensure that we have their ongoing support for what we're doing. The second is that we wanted to take advantage of the presence here of Bentley trustee Jeroen van den Hoven, who is a leading authority on responsible innovation. His research in responsible innovation inspired the theme for this colloquium and, in fact, also inspired the theme for the event that preceded this last night, the event sponsored by the Center for the Integration of Science and Industry, which was also a fascinating program. We are very fortunate to have as our closing speaker Roger Simnett, who has been visiting us from the University of New South Wales. He too is an internationally known expert, in this case on corporate reporting, and he will be talking about the social responsibility issues related to corporate reporting. So right there with our two keynote speakers, I think you can see that this theme of responsible innovation engages both the business side of our house and the arts and sciences. But if you're not convinced, you will be convinced by the work that follows. We will have 32 Bentley faculty members and doctoral students, and I'm not counting their many collaborators, both here at Bentley and in other universities, who worked with them on the projects that they're reporting here.
Ten different departments from across business and the arts and sciences are represented among our speakers today. And I think you'll see that this range of speakers represents the breadth and depth of research at Bentley and also showcases the different kinds of scholarship in which we engage: that we're engaged in discipline-based scholarship, that we're also involved in doing policy research, and that we're also involved in education and curriculum-based research. So our program today does not really allow for a formal wrap-up. Well, I think we'll all do that at the reception afterwards. But I did want to take the opportunity to try to wrap up in advance by reminding us all of why we do research at Bentley. And there are really three reasons. First, it's really meaningful and important work. Second, it's collaborative, social, and often interdisciplinary. And the third reason is that it's fun. And I hope you will have great fun today. At this point, I'd like to turn the mic over to Mike to introduce our first keynote speaker. Thank you. >> [APPLAUSE] >> Thanks very much, Lynn. I guess what I have to add to all those things that you put out is, of course, the thanks to you for the work you've done. [INAUDIBLE] >> [APPLAUSE] >> It is indeed a pleasure for me to introduce Jeroen van den Hoven to you this afternoon, and on behalf of all of us, to also thank him for agreeing to be the opening keynote speaker at this, our inaugural research council research colloquium. Jeroen is a professor of moral philosophy at Delft University of Technology in the Netherlands and vice dean of the Faculty of Technology, Policy and Management. He's also a scientific director for the three technical universities in the Netherlands and editor-in-chief of Ethics and Information Technology. I did indicate to Jeroen that I would be keeping my introduction very short to leave as much time as possible for him. But in reality, going through his CV in greater depth would just reinforce my own insecurities, so I'm not going to do that. Jeroen's personal research achievements and his research leadership are truly amazing. He collaborates globally. He has received numerous grants from the Dutch Research Council and, if I remember correctly, from the corporate sector as well. He's been an advisor to the Dutch government in various roles, and he received the World Technology Award in 2009. I'm sure that Lynn will be talking later, probably even more as we develop this colloquium, about how Jeroen's support for the colloquium has extended well beyond agreeing to be the keynote speaker. A number of our colleagues, I think, visited him and went to his conference there as a precursor to designing this event. So Jeroen, thank you for that support. I'm also aware that Jeroen has reached out to a number of you with offers of collaboration, and with the desire that he's expressed, and I think also expressed to our trustees, for Bentley to engage in some of the international networks that he is part of. And I suspect more than being part of, that he's actually the hub of. Jeroen is married with two kids, to change track. One of them is a son who I think is still studying medicine. And his daughter, is she still at the [FOREIGN] or has she graduated? >> Moved on. >> She's moved on, so she's also at college now. And then, of course, he has one strength and one weakness. The strength is, for those of you who are like-minded, he plays the classical guitar.
And the weakness is, from my perspective, that he runs 40 kilometers a week. >> [LAUGH] >> Are there no limits to my insecurities? >> [LAUGH] >> Jeroen, thanks for being here today and for agreeing to join our board of trustees some years ago. The challenges and opportunities you raise for us, and your insistence that we can be at the sweet spot of one of the critical areas for business in society today, are both inspiring and motivating. We appreciate your willingness to walk down this path with us. Thanks very much. Jeroen. >> [APPLAUSE] >> So I should change this? Mike, thanks very much for those friendly words. And thanks for inviting me. It's, yeah, it's a great pleasure as a member of the board of trustees to address you and to also be part of the primary process, the core business of the university, and to contribute to ideas for teaching and also ideas for research. And I was already incredibly impressed. Lynn kept this secret from me, so it was a big surprise to see all those wonderful projects and so many of you coming together from different parts of the university to team up and do research in what I think is a very promising and also inspiring topic for research, but also for teaching. And this is the idea of responsible innovation, which I've been working on together with the Dutch research council but also with the European Commission. And I will have the opportunity to say a little bit more about it and to hopefully convey why I think this is such a promising theme. So we'll do this in four steps. I'll talk about the grand challenges or, as you would call them here, the trillion dollar questions or wicked problems. Value sensitive design, which I think is the core idea, and I've seen many of the projects displayed here talking and speaking to that idea. The problem of moral overload, which I think is a core idea in thinking about innovation from a moral point of view. And something about the conditions of responsibility. So, first about the grand challenges. I'll see whether this also works. In a way technology is wonderful, it saves lives and could alter lives in these ways, and we will talk a lot about these applications and the newspapers are full of them. And they will give rise to incredible societal debates about whether these are interesting things to do or not. Some of the things are clearly very good to have, there's no or little discussion about them, but other things are not so good to have and we have a lot of discussion about them, often after the fact. So it's always appropriate to ask: this is new technology, but is it good? In Europe, this question has been put center stage. We are moving into a new R&D program. It's a ten year program. We've come out of the old one, the FP7 program. And this new program in Europe, Horizon 2020, has, I think, around 80 billion euros for the next ten years. We may be down to 70 as a result of our financial problems, but you have a little bit of financial problems yourself here. But still, a good amount of money for R&D. And the interesting thing is that we managed to think about how we would position that type of research [INAUDIBLE] the big societal issues. This idea of an innovation union is [INAUDIBLE] towards, explicitly towards, the ideas and the aims and the objectives that were formulated in the Lund Declaration in 2009 under the Swedish Presidency. And these stipulate that R&D should be geared towards the grand challenges, and we know of course what these grand challenges are.
Clean drinking water, safe food, climate change, overfishing, sustainable energy, waste management. We know them, and we recognize the millennium goals, which have just been renewed by the UN and restated again. So we will have a lot of debate about shale gas in Europe, more so perhaps than in the United States, but more will perhaps be coming. We will be talking about emissions and climate change. We will be talking about IT and health. We will be talking about synthetic biology and nanomedicine. And in Europe, as I said, this is very much geared towards the grand challenges. We are moving from a phase where we were talking about science and society, to science in society, and now science for society. And we've made this effectively part of the funding scheme in Europe, a cross-cutting theme: responsible innovation. And according to the latest estimates, some 500 million euros are dedicated to doing research across the board, for all disciplines, for all sciences, on this topic. I've advised the European Commission on this, and as a result of that this has happened, but there are more forces that are working and converging in this direction. In the Netherlands, we have this program, and it functions as a benchmark and an example for Brussels. It's called the Socially Responsible Innovation program. And we're moving out of phase one, which had 15 million euros associated with it, to phase two where there is, according to the latest estimates, 50 million euros available for that. Some people think that this is very innovative. Or this is very innovative. >> [LAUGH] >> Or this is extremely innovative. >> [LAUGH] >> Super innovative. Of course. And I found another example: it was a prize for innovation given to game designers, Dutch game designers. I use Dutch examples, otherwise it would be impolite to make fun of them. So, ridiculous innovations. This is Ridiculous Fishing. And I'll leave it to you and to your imagination to figure out how many hours will be wasted on playing this ridiculous fishing game. But this is an example of at least a frivolous innovation. Perhaps the time and energy and the creativity that went into coming up with this idea, let alone playing the game, could be better spent. And so we try to move away from that, and focus our research energy on the grand challenges. So it's always good to bear in mind, it's always good to ask the question: this is an innovation, but is it good? That's always a legitimate question, and it should be on the plate of the people who come up with the new technology, who therefore need to take responsibility for it. This brings us to an important ingredient or trend in research that tries to incorporate and express and embed public values, or the values that we share, into the technology. And I think that is also, from an ethical point of view, a very interesting move. We've always left ethics up there, at the high table. It's a very posh discussion among scholars and people who are in the ivory tower. And this is a way to bring it down and make it part of technology and institutional design. And I'll show you how this is done. Here's an example that some of you may know, coming from New York. These are Robert Moses's low-hanging overpasses, and some of you know this example. The car has something to do with it; let me see whether this works. So something clearly has gone wrong here.
But we know from the study of his work, of Robert Moses's work and life, he was a famous town planner and architect at the beginning of the previous century. He got a lot of big projects. But it turns out, reading his biography and the analysis of his work, that he was a little bit of a racist. And he made these overpasses intentionally low so as to prevent buses from being routed from the poor black neighbourhoods to the white middle-class beaches. So people were building these things and, we even have it in writing, they reported: isn't this lower than what we usually do? Well, it was, but it was for a purpose. And so if you stand beneath those things you say, it's just that low, but actually this is a barrier, a racist barrier, which prevents and constrains the range of options that future users will have. So in this case, a wrong value is expressed, gets expressed, in the technology and from there on starts to constrain the future users. Well, once you've seen this example, you will recognize many, many more. And I have a couple of them for you. This is one I came across, this is the door of humility of the Church of the Nativity in Bethlehem. And it's called the door of humility because you have to kind of bow your head in order to go in. Well, if you read the historical analysis, this was done on purpose, but as a security technology, in order to prevent mounted horsemen from raiding the church. So it's a piece of security technology, if you will. This is also an interesting one. This, for once in a lifetime, was an IT project that was done according to the specifications and requirements, on time, and within budget; you know how rare that is. But anyway, it is a real-time audio and video link on the helmet of the guys in the fire brigade. And they are connected in this way to an emergency medicine center. And, of course, the idea is that these guys arrive first on the scene of the disaster and therefore they could benefit from the advice of the medical staff. Good idea. First test failed completely. Why? Well, it's nothing to do with the smoke, but it has something to do with ethics and values. It became immediately clear that the medical guys were telling the people from the fire brigade to put a gel pack on the arm of the boy, right, so that's their thing. But the guys from the fire brigade were complaining and saying, no, there are five people in an adjacent building, I have to apply foam over there, right? So although this IT project from a technical point of view was impeccable, it was done according to the specs and requirements and it came in on time and within budget, you could never make this work if you had not given attention at a very early stage to how these different professional responsibilities and their implications would play out in the use of it. So you need to consult and you need to pay attention to the values for which you are designing this thing. Everyone knows that this is not a neutral thing, it's biased. You can manipulate it in whatever way; the values can be built into the algorithm. And you're familiar with this, this is the formula that killed Wall Street, a very nice paper in Wired Magazine. This model for gauging the risks of mortgage portfolios went into the software packages of the people that were using them and advising on risks, and it went all wrong, as we know. We found out after the fact. So values are built into systems. At every level choices are made and they're expressive of underlying values.
So the thing that you buy, the thing that you use, has certain values embedded in it, and it then starts to constrain you in particular ways. And some of them may be trivial, but others may be really important, as we saw with the low-hanging overpass. It just kind of discriminates against you. And so I leave it to you, I mean, there are so many examples of this. But values get built into interfaces, infrastructures, algorithms, ontologies. And we delve, and we collate, and we have collections of examples and case studies which show, in the way that the examples I've put up show, how this is done and how it starts to affect the future users. So the key problem in the 21st century, one of the key problems, there are more of course, but there's a real key problem and it's the problem of value sensitive design. So this is the world of ethics, of politics, of values, of the things that we really think need to be done and the world needs to comply with, right. And so here are examples: responsibility, and privacy, and autonomy, and safety and security. And here's the world of engineering, of computer systems, and networks, and protocols, and bridges and systems. And, of course, these are conveniently vague. We have very strong feelings about them, but it is very difficult to be precise about them. This, on the other hand, is the world of engineering and technology, and you need to be incredibly precise, otherwise it doesn't work. So how do we marry those two worlds? Of course, we want what we do on the right-hand side to be an expression and an incorporation of the things that we think are important on the left-hand side. PowerPoint allows you to do this in a very easy way, but everywhere I can point my pointer, there is a methodological problem, there is a conceptual problem: how do you do it? What does it mean to express, to implement these things? How do software engineers do that, actually? Once they've done so, how can they justify and audit that they've done it in the right way and the best possible way? This is a real problem. And it's already apparent from some of the projects that are put up here that this is the thinking underneath, and we should be really looking at this type of problem. It will be a recurring problem on our table, on our plate, every other day. This is the structure. So we need to learn to design for X, to design for privacy explicitly and be accountable in that respect. How did we do it? Design for security, design for inclusion. Design for transparency and safety and accountability, and human capabilities, to put it in an even wider perspective. So, learn to design for shared values. This, of course, is slightly overburdening, because now, suddenly, we're stuck with all of those values. We have to design for safety, and privacy, and accountability, and all of these things. So, this is the problem of moral overload. We are morally overloaded. And that's a problem. If we get it wrong we may mess up, miss out. This is a way of messing things up in technology, this is a way of messing it up. But I'd like to draw your attention again to two Dutch cases where we messed it up and missed out on opportunities at the same time. This is the idea that's present everywhere to smartify electricity grids and to bring down CO2 and reach our CO2 reduction targets in Europe before 2020. It was a no-brainer.
Everybody was ready to roll, the technology was there, the electricity companies were there, the government was on the same page. So we thought about a smart meter, a little computer that was introduced in every household and took a snapshot of the electricity consumption in the household and sent it off to the electricity companies in order to do some peak shaving and fine-tune the electricity supply. So technically no problem, everyone was ready. But then the privacy discussion kicked in, in the Netherlands but also wider in Europe. And after ten years of R&D and development of the smart meter, it was rejected in the upper house, right, in parliament. So it was killed because of privacy constraints and privacy considerations, and we were back to square one, and a lot of money was lost. So, they claimed they made a smart meter, but the meter was not smart enough to incorporate and anticipate what kind of considerations and objections would follow later on in society. If they had done so, they could have used them as requirements for their design and made a really smart meter, one that anticipated and accommodated those problems. And the same for the electronic patient record system. Same thing, same story. A 300 million failed innovation. Why? Privacy concerns, right, rejected in the upper house. No serious attention to the privacy issues. If they, again, had used those privacy considerations up front, front-loaded the ethics, as it were, into the design process, and turned them into requirements for their new systems and services, they would have done a much better job. So, this is the structure of the problem again. We want our sustainability, in the case of smart meters, and we want our privacy. Think of the same problem in the discussions, also very prominent in Europe, about camera systems: the same old kind of balancing of security in the street versus my privacy of being caught on camera. The structure of the problem is as follows. This is the problem of moral overload, and it shouldn't be fatal, so this is the good news, it doesn't need to be fatal. You want your security above a certain threshold level and you want your privacy above a certain threshold level. Therefore, your ideal solutions are over there. But you're stuck with a first-generation, relatively primitive camera system. And that camera system, if you introduce it, gives you razor-sharp images of the innocent citizens and blurred images of the crooks, right? So you have neither security nor privacy. Or you move towards a slightly better, next-generation 2.0 camera system. And you hang that everywhere and you have a lot of security but no privacy. Or you decide not to hang that everywhere and you have a lot of privacy but no security. Where you want to be, of course, you want to have your cake and eat it, you want to square the circle. And you want to have a very advanced, fine-grained technology that you can configure in such a way that you can have the advantages the technology has to offer, without the drawbacks. And in this direction there is a lot of innovation. Innovation often is actually reconciling antithetical or contrary values that seem to be in conflict, and you come up with a really smart solution. That is what smart often indicates: I've come up with a very clever solution for a genuine societal problem. And this is the moral principle that is driving it.
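[Editor's note: the threshold structure of "moral overload" described above can be made concrete with a minimal sketch. This is an editorial illustration only, not part of the talk; the option names and numeric scores are hypothetical. The point is simply that moral overload behaves like a constraint-satisfaction problem in which an acceptable design must clear every value threshold at once, and innovation means adding an option that does.]

```python
# Minimal sketch (hypothetical numbers) of the "moral overload" structure:
# every candidate design must clear a threshold on EVERY value at once.

PRIVACY_THRESHOLD = 0.6
SECURITY_THRESHOLD = 0.6

# (privacy, security) scores for the camera options described in the talk,
# with made-up numbers used purely for illustration.
options = {
    "first-generation cameras (sharp innocents, blurred crooks)": (0.2, 0.2),
    "generation 2.0 cameras hung everywhere":                     (0.1, 0.9),
    "no cameras at all":                                          (0.9, 0.1),
    "advanced, configurable camera system":                       (0.7, 0.8),
}

def acceptable(privacy: float, security: float) -> bool:
    """An option is acceptable only if it clears both value thresholds."""
    return privacy >= PRIVACY_THRESHOLD and security >= SECURITY_THRESHOLD

for name, (privacy, security) in options.items():
    verdict = "acceptable" if acceptable(privacy, security) else "fails a threshold"
    print(f"{name}: {verdict}")
```

In this toy version only the last option clears both thresholds, which is the "have your cake and eat it" region the speaker points to; the transcript resumes below with the moral principle he draws from it.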
If you can change the world by innovation today so that you can satisfy more of your obligations tomorrow, you have a moral obligation to innovate today. And if you do an analysis of some historical examples, you will see that this often is an underlying driver of innovation. Often not voluntarily chosen, but as it happens to be. Look at Germany: the rest of the world was shaking its head when, in the 70s and the 80s, the Green Party was kind of wreaking havoc and chaining themselves to every factory they could find. And so everyone wondered, how can you make economic progress, how can you look at economic growth, when there is so much resistance in society? So this is the tension, sustainability versus economic growth. Well, today, Germany is the market leader, the world leader, in renewable technology and sustainability technology. Why? Well, they had to innovate themselves out of the problem. They had to find a solution that would give them both. If they had been able to sweep one of those values under the carpet, pretend it wasn't there, they would have taken away an important impetus, a very important trigger, for innovation. It's because you make it difficult for yourself, because you're ethical, right? You take both values, or all of those values, seriously, and then you have to come up with really smart ideas. Same thing for privacy enhancing technologies. You know, there has always been this big tension between Europe and the US over privacy; I won't go into the recent little things that happened. And people around the world were kind of wondering about the Europeans, why they made such a fuss about privacy. They put it into their laws and they have regulatory authorities, and inspection and monitoring, etc., and fining. But now, in the age of big data, we have developed in the last years in Europe privacy enhancing technologies big time, right, and the rest of the world thinks, well, it might perhaps be a good idea. So there is business there; it doesn't need to go at the expense of doing interesting things and creating some value in another, economic, sense. This is also a very nice example, it's the Clipped Tag that IBM worked on. It's an RFID tag that's attached to consumer products. You buy them in the shop and it allows them to track you, within a certain range, all right. But what they thought of was that it has indentations so you can tear off a piece. It has an affordance, so it's easy for a citizen or a consumer to see: you can tear off a part of the antenna and therefore shorten the range in which you can be followed. So if it's okay for you to be followed and tracked and traced within the shopping mall, there's no problem. But if you don't feel comfortable with being tracked and traced outside, when you go onto the parking lot, you tear off the larger part of the antenna. So: design solutions to value conflicts, solving the problem of moral overload and, at the same time, innovating. As a conclusion, something about responsible innovation. When you think about it, responsible innovation sounds a little bit like a lazy chair. The innovation itself isn't responsible. People are responsible. They behave responsibly, in a responsible way. So it's like a lazy chair. A lazy chair is a chair that is not itself lazy, but it accommodates lazy people. If you're lazy, feel lazy, you can sit on it, or you can pretend to be lazy in a lazy chair, etc.
So a responsible innovation is probably referring to a process or procedures or organizations that accommodate responsible behavior, that accommodate people qua responsible agents. So that is a requirement on how you design organizations and the protocols and procedures, in order to accommodate them. I have to [INAUDIBLE]. So what we have to be especially on the lookout for is that the way we proceed with these matters doesn't provide us chances to come up with excuses. Like: we didn't know what we were doing. Okay, we're reintroducing asbestos, or we're reintroducing carbon nanotubes, and we didn't know what we were doing. We didn't know what the options were. We didn't know what the consequences were. Or: we didn't have any time to consider, because of our time to market, we wanted to be the first, etc. So all these things, of course, in proportion to the things that can go wrong, we have a responsibility to consider and to take into consideration before we start to release things into the outside world. I'll skip that, it's a long definition. But I just wanted to draw your attention to the fact that innovation is now no longer exclusively about just frivolous new functionality, a piece of new functionality: this allows me to do something faster, quicker, with more data, etc. That's just functionality. We're also looking at how moral values are affected by that, whether we accommodate and respect moral values in introducing and designing this. And not only one particular value, our champion value, like sustainability or safety, because we need to take all of those values into consideration. An interesting example in the Netherlands, where a bus exploded. It's a bus that was being designed for sustainability, it drives on liquid gas. It had been made so sustainable that it exploded. Safety was out of the window. So you need to not just optimize over one moral value, sustainability, but also take the others into account. So: a set of values as constraints for your design. The fourth level, if you want, is where the expansion of the set of moral obligations that we can satisfy is your major aim in innovating. Think of the innovations which I showed, where suddenly, instead of one obligation in a moral dilemma, where I have to choose between this and that, you've changed the world in such a way that you can satisfy two of your obligations. So actually, you've gone forward from a moral point of view. You've actually expanded the set of moral obligations that you can satisfy. And that's your aim in innovation, right? That was your first aim. And that helps you to a new concept of innovation, where it is about trying to change the world in such a way that we can satisfy more of our moral obligations. And just to conclude, because I'm probably running over time, well over time: for Bentley, I think this is a wonderful opportunity, and I commend you for picking up on this trend and getting all those people, wonderful colleagues, together to work on this. And I've always, since the first day I arrived here, said you're looking at a golden, I've referred to it as the golden triangle: the golden triangle of IT, ethics, and accountancy and finance. And if you look at the world's problems today, many of them are in that golden triangle. And I think that is a unique opportunity. And the way that you've dealt with it and picked up on it already shows that you're very nimble in this respect.
So this is what I have to say to you about this responsible innovation thing. >> [APPLAUSE] >> Now we're on to our first keynote listener, Bob Galliers. And we'd like to have the panelists for the first panel come sit over here. And Bill has got the great job of being the timer to keep everybody on track, and hopefully this will keep us to our target. So the plan is we have our first however many speakers this is, and then we have some concluding remarks by Bob. At that point, we are going to disperse around the room and have our panelists stand in front of their posters and interact with the rest of you. So those of you that are not presenters have the job of going around and interacting with our presenters. So that's the strategy. Let's see how well it works. I've got my fingers crossed, Bob. >> [APPLAUSE] >> Look, I'm listening, I'm sorry. Okay, so this panel is examining the theme of this colloquium from the perspective of discipline-based scholarship. Now, a word of warning, however: you'll find that the disciplinary boundaries are somewhat porous. And that's something I should build on in my comments at the end of this. So I'm Bob Galliers. My job is to introduce each speaker, as you've just heard, and to provide a summary at the end. No one's given me a hook to get hold of each speaker after their two minutes, but we'll see if we can certainly keep the time. Now, it's often said that research is a marathon. We're gonna change the mold today by having a sprint relay. So I'm very grateful to Bill for his watch and his timekeeping. But I should say, too, that there have been a number of thanks; in addition to Lynn and colleagues on the research council, we really should thank both Bill Wiggins and Kevin [INAUDIBLE]. Where's Kevin? Is he in here somewhere? For all the background work they've done. So please join me in thanking them. >> [APPLAUSE] >> Okay, I'm gonna stop. And our first speaker is Tom Davis from Natural and Applied Sciences, talking about communicating climate change. And Tom, that should be you, there you go, so just take it away. >> Okay, thank you, Bob. So this whole idea came from the European Geophysical Union meeting in Vienna last April, and a German postdoc we hosted in our department here at Bentley last year. Two of our NSF grant collaborators from Plymouth State University teach in their flagship meteorology program in New Hampshire. And our Bentley undergraduate student on this project also applied to another PSU, Penn State, to major in meteorology, but we're thankful she came here instead. The overarching question for us is: do TV weather forecasters have a responsibility to tell their viewing audiences about climate change? Such as how increasing levels of carbon dioxide, which you see in those data in the lower right graphic, are causing global warming and making extreme storm events, such as Hurricane Sandy, more likely and more severe. So, despite a really strong consensus among climate scientists, 97% of them, at least 30% of TV weathercasters do not accept global warming, with many of them claiming climate science is a hoax and a scam. So to explore this dichotomy we are surveying the science backgrounds of weathercasters, we are surveying undergraduate meteorology program curricula, and we're surveying undergrad majors in meteorology programs. And we are phone interviewing TV weathercasters, at least trying to do that. And we're also planning two workshops on climate change for TV weathercasters and meteorology students in the coming year.
So one early result is that the large majority of TV weathercasters here in New England have degrees in meteorology. In fact, many of them have graduate degrees. So a lack of science education is not the answer to this problem that we have. It must be something else. So please visit our poster, it's over in this far corner. >> [LAUGH] >> Thank you. >> [APPLAUSE] >> That's a great start. So now we're on to Tamara. Tamara Viviane from the CIS department, talking about the downsides of ubiquitous computing. >> All right. In the physical world, people and organizations have developed a more or less common understanding about what constitutes appropriate sharing of information and how to maintain privacy. Not so in the highly networked digital world, where, as recent studies of social network users have shown, most people do not change the default privacy settings. People's actual privacy settings often do not match their stated intentions on data sharing. And even if people have their privacy settings figured out, those settings can be circumvented by new things within those environments, like, for example, Facebook apps, about which users are mostly completely uninformed. So the whole privacy settings thing just doesn't work. So there's one vision which suggests having personal data banks, which would store user-owned personal data and would transact that data with external entities, like businesses, nonprofits, social networks, governments, on behalf of the user, honoring the user's specified preferences. Achieving this vision requires that we investigate how to develop mechanisms for usable privacy while still allowing an individual to be the sole owner of their data, mechanisms that will inform him or her about the possible use of that information fully, at the right time, and in an easily digestible form. So these are the questions that I have been pondering lately. And if you are interested in discussing this, please come find me; I will be at my poster during the discussion. >> [LAUGH] [APPLAUSE] >> How about that? An IT project on time and under budget. Amazing. So, Rani Hoitash from accountancy: can we trust financial reports? >> So, as Warren Buffett said, trust is like the air we breathe. When it's present, no one really notices. But when it's absent, everyone does. Trust is basically a psychological state where individuals leave themselves vulnerable and expect a positive action from another individual. Research is telling us that trust in organizations is highly related to one major factor: information sharing. When trust is present, information is shared. Now, in order to produce financial reports, individuals within the organization need to share a lot of information. And therefore we believe, in this study, that trust is going to be associated with financial reporting. So trust leads to information sharing and exchange, and then to financial reporting quality. Now, we obtained data from a company that is called Great Place to Work. You are probably familiar with this company, mostly based on its ranking of the Fortune 100 Best Companies to Work For. We basically, something is happening but I'll continue. So we basically use the data [INAUDIBLE] in order to create a [INAUDIBLE]-level measure of trust. And then we ran a statistical model. And we find that trust is associated with increased financial [INAUDIBLE] quality. It's associated with better [INAUDIBLE]. It's associated with fewer financial [INAUDIBLE] statements. And it's associated with less earnings [INAUDIBLE].
So we do find that trust is very meaningful and important. And the implication of our study is that we recommend that managers should try to engender trust within the organization, and spend greater time and effort to improve trust with their employees; that has the potential to [INAUDIBLE] quality, among other potential benefits. Thank you, I'll be right there in the poster session. >> [APPLAUSE] >> I told you Kevin should be thankful he's worked behind the scenes; now he's come up on stage, that's great. So next is Jeff Moriarty from Philosophy, and more on privacy in the information age. >> Thank you. My thesis is that improved data collection and aggregation technologies threaten our autonomy, and in two ways. They make it more difficult to do what it is that we want to do. And maybe more importantly, and this is a less noticed feature, they make it more difficult for us to be who we want to be, to present ourselves in a certain way in a public setting, such as, for example, the ranger over there on the right. Let's understand that informational privacy can be understood in terms of control and access. A person has informational privacy with respect to certain information about her just in case that person has control over who has access to the information. Privacy is important. Among the reasons it's important is that it protects our autonomy. The autonomous person is the author of her own life. She does what it is that she wants to do, and she also controls who she is. With improved data collection and aggregation technologies, more people can know a lot more about you. The result is that you may be tempted to edit your actions to conform to society's expectations. If you are a unique personality, that may be a problem for you. If you're a square like myself, it's not a big problem, but it depends. But more importantly, and this is a more novel claim, your ability to present an edited version of your life history to others is going to be compromised by the fact that everything you do is out there. You might not want to tell the guys in your motorcycle gang about your passion for quilting. Both are things that you do; nevertheless, you might wanna present a certain face of yourself to the public. So these increased or enhanced collection and aggregation technologies need to be met with an increased or enhanced respect for privacy on the part of many actors. Thank you. [APPLAUSE] >> And next is Joni Seager from Global Studies, again talking about climate change, now from gender perspectives. >> Thank you. Thank you very much. The research I'm gonna talk about today comes out of at least 20 years of strenuous cross-disciplinary effort to bring social science perspectives into the study of things environmental. [COUGH] Which has long been presumed to be the domain of the physical sciences. That's a whole other story that would need a lot more than two minutes. But we have now arrived at the point, at least, where it's generally agreed that most environmental problems are caused by humans and by particular social, economic, and political arrangements; this is particularly evident with climate change. So if we're going to solve those problems, or move towards solving them, we desperately need a social analysis as well as a physical analysis. And within that tool kit, we absolutely need a gender analysis.
And without it, I would argue, our efforts to change human systems will fall short and fail, because gender alignments and gender hegemonies are deeply embedded in those very social arrangements that we want to change, or need to change, to solve our environmental problems. Gender analysts in this field have kind of picked the low-hanging fruit of looking at impacts and vulnerabilities to environmental change, which are very clearly gender differentiated. But if we want to move on to really solving problems such as climate change, we need to look at the drivers of climate change. And most recently, three drivers have been identified as particularly important: who makes energy policy, first world consumption patterns, and cultures of risk and risk taking. And I think many people in the room could see immediately how gender analysis would be pertinent to all of those. But let me end by saying, if we look at consumption, and some of the really big consumption drivers of climate change, I'm gonna say three words and you figure out where the gender alignment is: meat, cars and household appliances. [APPLAUSE] >> Continuing on the theme of gender, we now have, from finance, a talk about board diversity from a gender perspective and CEO selection. >> This is work done jointly with Kartik Raman, who was my colleague in the finance department. So the paper is about board diversity and CEO selection. What we're looking at is whether board diversity impacts the probability of a female being selected to the CEO position at the firm. So who cares? Why is this interesting? [LAUGH] >> Women are approximately half the population, 56% of college students, and approximately 2% of the CEOs at S&P 1500 firms, so they're just profoundly, deeply underrepresented in the top ranks of the corporation. So the question comes up, why? Why do we find this? And we can think about two possibilities right away. One is that for some reason boards won't appoint them. Boards tend to be heavily male dominated, so there are all sorts of issues that one can come up with. Secondly, there are no qualified candidates; there's a dearth of supply of females in the system. So what we do is we look at a set of 112 women, that's all we've got in the S&P 1500 in the last 25 years of female CEOs, and a comparable sample of firms with male CEOs. And we look at the board structure, and we see if there is any difference in diversity, gender diversity, on the board, and if it influences the selection of the CEO, male versus female. And what we find, basically, is first that gender in the boardroom matters. What does that mean, it matters? Well, look at two specific hypotheses. First, we argue that female directors improve the quality of the evaluation of CEO candidates. That in some sense having women present in the room will even the playing field, so that everybody's being evaluated on an equal basis. And consequently you'd expect gender diverse boards to make a difference, and the data shows that that is in fact the case: the more women on the board, the higher the likelihood that a woman will be appointed the CEO. Jeez, secondly we thought. [LAUGH] >> One other thing we find is that it takes three women, not one or two, to make this happen. Thank you. [LAUGH] [APPLAUSE] >> So next up is Alina Chircu from the Information and Process Management department.
Again going back to emerging information and communication technologies, and some of the ethical and privacy considerations associated with that. >> Hello. My focus in this research is to understand the broad domain of ICT ethics, and actually to try to apply some of the established ethical theories to the field of emerging technologies. I am guided by a number of theoretical underpinnings in ethics, as well as meta-ethical analysis and a number of large scale projects, the most notable of which is actually the ETICA project for information and communication technology in Europe. There are also a number of information systems researchers that have written about the topic, especially in the domain of discourse ethics. And this has drawn not only the attention of information technology researchers, but also researchers in other fields like nanotechnology, where the concept of anticipatory ethics for emerging technologies has been developed. I'm also guided in this research by my previous work on the business value of technology, which over time has actually evolved to include the values of technology: a multidimensional perspective, multiple stakeholders, as well as the idea of contingent value. And so my approach is to look at two technologies that I have studied in the past, one of them being mobile technologies in developing countries, the second RFID, and also to look at two technologies that I'm not that familiar with from a research perspective, wearable devices and big data, and to discuss some of the technical and some of the ethical trade-offs that these technologies present. >> [APPLAUSE] >> And next we have Ryan Bolton from the Natural and Applied Sciences Department. How can material science be environmentally benign? [SOUND] >> [LAUGH] >> All ready! >> [LAUGH] >> All right, wow. Welcome to the business school, right? No, so my work deals specifically with the structure of how products reach the market, and the beginning phase of that. So molecules come together to make materials. These materials are basically developed within the basic research realm, right. Then you go to components, that's the applied research realm. You formulate devices from that, that's development, or engineering specifically, and then you create products. But what's missing from this is really the environment, okay, in some ways, and working to close that loop, and then looking at inputs and outputs at every step of the process, all right? And so most of my work, the classes I teach, deal directly with how do you do this, and how do you do this from renewably based sources? And of course the triple bottom line is always important, right? Where the role of green chemistry kinda comes into play is not at the end, that's chemical policy or really environmental science, where you're looking at treating the tailpipe emissions, treating the waste, all right. Green chemistry plays in the beginning, and the idea is to eliminate the hazards so that, in some ways, you don't have a need for environmental science downstream. Right, if you design your products to be safe in the first place, you don't have those problems down the road. Engineering comes into play in the middle where you're actually formulating those devices. My work is specifically in terms of materials, more specifically in terms of polymers and plastics.
And what we actually do, if you look down at the bottom: in our labs here we've got two Bentley undergrads who are working on making flame retardants, safe flame retardants, out of wood, basically. So that should make you scratch your head, but in reality, it does happen. We're making safer soaps based on apple waste and fruit waste. And we're making electrically conducting polymers, which could basically be the face of this screen in a few years, that are basically metabolites of tryptophan. So, that's what we're working on. >> [APPLAUSE] >> Next is Les Waguespack from CIS, and again we are going back to the design theme here of good values infused in design. PowerPoint ignorant. >> [LAUGH] >> I'm in a well thought out department. >> [LAUGH] >> If we think about design, either in the small or in the large, it's an expression of some intentions that we're trying to use in order to form a particular artifact. And the value that we experience has to do with the reflection of those intentions in the artifact when we experience it. So design illuminates value, but it doesn't necessarily create it. If we think of design as both a verb and a noun, then we realize that value depends on the lens through which we express the intentions of the design and then interpret it later on. Whoops, come on, fingers. And so there's the why, the what, and the how of design: we have the why, which is the intentions, in order to get to the what, and we usually spend a lot of our energy on the how. It turns out that this metaphorical lens with which we observe the reasons, and then the interpretation, is where most of the satisfaction and the effectiveness comes from. So that lens becomes an incredibly important part of how we understand what's going on and how it fits into the structure. So in thriving systems theory, we've developed a series of properties with which to evaluate what we see and what the lens is that we reflect on. And those properties, through cluster analysis, can be put together in order to identify design quality aspects that are involved in the process. Some of those are structural, and a lot of these are seen in software engineering. But a lot of these are new to software engineering because they're aesthetic, or they're subjective, in the process. And that's why we call it thriving, because that's where they coincide. So if you want to design and infuse values, you've got to start by having explicit value propositions, including both aesthetic as well as structural objectives. And we have got to have value sensitive requirement specifications that incorporate those in the process. And finally, you have got to have techniques for building and employing it such that you don't destroy those intentions in the process of actually manufacturing the artifact. >> [APPLAUSE] >> While Kevin is fixing that, let me introduce Linda, our final speaker, who's going to talk about ambidextrous innovation, and I'll leave it to her to explain what that means. >> All right, so I'm going to present a paper that's written, actually, that's done. It's Craig Randall's dissertation, so I'm just the talker here. Bob and I were both on his dissertation committee, so don't hesitate to ask Bob all the hard questions that you have with respect to this piece of work. So, Craig's a real practical guy. He comes out of industry, he's a software engineer, and he said, look, something that he noticed when he was at work is that firms have trouble delivering novel innovation. Well, why?
Why do some firms do this really well and why do other firms really struggle? That's what he really addressed in his dissertation. He framed it in terms of exploration and exploitation innovation: exploration is research into the new and novel; exploitation is things that have already been done, incremental innovation that adds to something that's already there. And he used agency theory lenses and resource dependence lenses. Well, he did two studies really. He went out first and talked to a bunch of people. So he went out and he interviewed some software firms and some venture capitalists. He really decided there is this difference; something is going on with innovation, it's not just sort of his perception. And then he went and did a quantitative study to help him understand what drives those differences. Happy to tell you all about those later, but he did a rather nice job. He has some really exciting models; we'll just leave them there. What he found is that development plans do change during research and development. That indeed, it's not just planning, which is what the literature said in advance, but there's more going on. And it seems to be a problem all through the development process: manpower gets diverted. When manpower gets diverted, exploration innovation doesn't happen. Agency and resource dependence seem to be good lenses to view this, unlike the previous lenses that were used. And indeed these changes originate outside of R&D, they're not correlated. Some implications: well, for scholars, these are good lenses to use. We should use different lenses other than the traditional ones we've been using. For managers: don't change your manpower, really keep people on their projects and you'll get more exploration research. That's it. >> [APPLAUSE] >> Okay, I'm supposed to sum up all of that now in three minutes. Okay, ten presenters, eight departments. And I'd like you in some cases to kind of think which department they were from, because, as I said before, there is a lot of porousness between our boundaries. A certain professor at London Business School used to say, God didn't make the universe to accord with the faculties of universities. And I think we've just kind of proved that. Some of the emergent themes we've had, then, under our overall theme of responsible innovation: environmental issues, diversity considerations, ethical concerns, exploration alongside exploitation. A computer scientist who is building on the work of an architect. Privacy and ethical considerations, not just from a philosophical perspective, but from the fields of information and process management and computer information systems. Issues of trust, in relation to board appointees as well as in relation to financial reporting. Climate change, but not just climate change from the perspective of the science, but how we communicate issues associated with climate change, and from a gender perspective, as well as perhaps from an institutional perspective. So these are just a few examples, and we'll see more and more of those as we go on into the other sessions that follow this afternoon. So, my themes are breaking down disciplinary silos, research that informs policy considerations and practice, which is our next session, and then on to that in relation to curriculum innovation, thinking ahead about how our curricula should change given the research we're doing in this arena.
Overall then, thinking more holistically and systemically, with sustainability and responsibility in mind. And to come back to that question, it's not an or in many of these instances, it's an and. So that's where the ambidexterity comes in. And from that, I want now our presenters to go to their posters, so that if there's something that really piques your interest in these very brief presentations, you can now discuss it in the next half hour, in fact I think we've got 32 minutes, which is amazing. So follow the presenter to their poster, and then enjoy the discussion that will unfold, thanks very much. >> [APPLAUSE] >> I'll just make a quick note that we didn't build a specific bio break time into our schedule, but these half hours can be time to do that too. But don't leave our lovely presenters lonely! >> All right, trying to keep us on schedule, my name is Tony Buono. I'm a professor of management and sociology, and I coordinate our Alliance for Ethics and Social Responsibility. It is my privilege and pleasure to MC and keynote-listen on this panel on responsible innovation from the perspective of policy and practice. I will once again be calling on the multi-talented Bill Wiggins to help me keep everybody in line. Or, as Warren Bennis, reflecting on his presidency at the University of Cincinnati, once said, it's something akin to herding cats. So here we go. All right, our first speaker is right behind me, Jean Bedard from our accountancy department. Jean. >> Thanks, Tony. Behold the Gem of Tanzania. This ruby, quote unquote, was acquired by the owner of a UK construction company for 300,000 pounds. It subsequently found its way to the balance sheet of this man's company, where it was used to prop up the sale of stock. The company then fails, and the ruby is found to be virtually worthless. The point I'm making is that this is a valuation problem: be careful of asset values in financial reports. Now, the use of fair values is increasing in accounting. Now why is this? Well, fair values, current values, are relevant, right? I mean, investors want to know this, but fair values can also be open to falsification. Auditors like historical values, because why? They're verifiable: you can go to the file drawer, you can pick up what you paid for it, and boom, the auditor's happy. But the fuzzier the numbers, the harder it is for auditors to provide assurance. Nate Cannon and I did a field survey where we asked auditors, what's hard about doing these fair values? And the auditors answered with respect to a bunch of different kinds of assets here; financial instruments certainly play a big role in this, but these things are pervasive. What are the problems auditors face? Well, these are often fairly complex things. The assumptions behind them are subjective, and they often involve looking into the future and predicting the future, which is difficult. So now here's the auditor, and management is presenting something with a very wide range of uncertainty. And the auditor is trying to reduce that uncertainty to the point where they can be comfortable that the balance sheet number is reasonably good. They do a lot of work and they still have trouble getting there, so this is very elusive. And what happens is it leaves uncertainty on the balance sheet. End of story, thanks. >> [LAUGH] [APPLAUSE] >> I feel like David Letterman, doing the Top 10 list. And now number two, Tony Kiszewski from our Natural and Applied Sciences department, who will be speaking about sustainable interventions against malaria. Tony.
>> So, my poster presents work that's a very small part of the puzzle of environmental management. Our work took place in Ethiopia. What's unsustainable about current malaria control efforts? Well, most of it is unsustainable: it's mostly based on biocides that provoke resistance, and resistance is running rampant all across Africa. Without an effective vaccine, everything is unsustainable. And it's not just one kind of resistance; there are all different kinds, drug resistance, insecticide resistance. There are other problems too: cash issues, user compliance, durability, re-purposing of bed nets. People will use bed nets for just about anything, catching fish, protecting crops. And nets don't last very long in households with lots of little kids running around with sticks. So what is sustainable? There are actually a lot of old technologies, basically filling holes, screening houses, things like that, that get ignored. Lots of studies get done on how useful they would be, but the interventions themselves just don't get done. So we're studying borrow pits, the pits people dig for mud to re-plaster houses. Our poster basically shows that communities create their own mosquito problems by digging these pits. But the pits have a very brief window of suitability for malaria vectors, so if you could manage them somehow, by delaying how often they're dug, or by adding Portland cement to the mud, or something like that, you could really reduce the number of mosquitoes that get produced. That's basically what our poster's about, so thanks. >> [APPLAUSE] >> Thanks, Tony. Our third speaker, David Yates from our Computer Information Systems Department, will be speaking about different paths to universal broadband access, where he talks about the infosphere and the digital divide. >> Thank you, Tony. So what's different about this work versus a lot of the other work that's been done in this area? We're looking at the different paths taken through the developed and the developing world. Harking back to some comments from Lynn and Bob, this is transdisciplinary work: I am a trained computer scientist, and Jeff Gulati is a trained political scientist. What's the problem we're looking at? If you look at this graph, on the x-axis we have the GDP of all the UN countries for which we track broadband data, and on the y-axis we have the number of broadband subscriptions. Looking across it, you can see significant differences in broadband diffusion. But it's not just a follow-the-money problem. If it were, I could draw some kind of regular function and all of those dots would lie on that line. So there's something other than GDP driving this. We wanted to look beyond wealth and resources to try to understand what's different in the developing world. How did we start? We started by looking at political structure: if we're going to look at policy, we need to understand the context in which that policy operates. So, political systems, political values. We looked at specific regulations and at what the institutions that affect this, the FCC in our own country, are actually doing, except we did it on a global scale. Our approach is quantitative, and so we end up developing probably dozens if not hundreds of regression models. I won't show you any of those here; there are two on our poster.
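To make the kind of model described above concrete, here is a minimal sketch of a cross-country broadband regression. The dataset, column names, and specification (gdp_per_capita, broadband_per_100, democracy_index, and so on) are assumptions for illustration only, not the authors' actual models.

```python
# Illustrative sketch only: relate broadband diffusion to wealth plus
# political and regulatory context, in the spirit of the study described
# above. Dataset and variable names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("broadband_by_country.csv")  # hypothetical country-level data

# If broadband were purely a "follow the money" story, subscriptions would
# track (log) GDP per capita closely; the extra terms probe what else matters,
# with a developing-country interaction so effects can differ by context.
df["log_gdp_pc"] = np.log(df["gdp_per_capita"])

model = smf.ols(
    "broadband_per_100 ~ log_gdp_pc * developing"
    " + democracy_index + regulator_independence + market_competition",
    data=df,
).fit()

print(model.summary())
```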
What did we learn? Really, six lessons. First of all, context matters more in the developing world. Competition and democracy specifically, as economic drivers of broadband growth, are incredibly important. It turns out that in the developed world we all have our MTV, if you remember that era, and we all have our broadband; for sufficiently developed countries it doesn't matter. In developing countries, a little bit of national investment goes a long way, and I refer you to the poster for the rest. >> [LAUGH] [APPLAUSE] >> Thank you. Our next speaker is Marcia Cornett from our Finance Department, where she will be speaking about corporate social responsibility and its impact on financial performance. I have co-authors, but no time to mention them. >> [LAUGH] >> We are looking at corporate social responsibility in the banking industry around the financial crisis. We get our CSR scores from a company that evaluates 3,000 businesses on 50 characteristics, things like community development, human rights, and corporate governance. They go through each of the 50 characteristics and say, okay, if you have a strength in CSR in this area you get a point of one. They do that for all 50, sum up the ones, and that's the strength score. Then they do the same thing for concerns, and that's the concerns score. This chart shows the all-strength score for our banks by bank size. The red line there is the biggest banks, and you can see they consistently have the highest all-strength scores; in fact, after the financial crisis they go up significantly. The next slide shows the all-concerns score. Again, the biggest banks have the highest all-concerns scores, but those go down significantly after the financial crisis, in fact down below some of the smaller banks. So we asked, okay, what's driving this? We went and collected a bunch of financial statistics about the banks, and statistics about their boards of directors. What we found was that banks that were the most profitable, banks that had the most capital, banks that charged the lowest fees on their deposits, and banks that had the most minority and female directors were the best at corporate social responsibility. So then we asked: we see that financial characteristics drive CSR, but does CSR in turn reward the banks with profitability? We look at industry-adjusted ROAs, and as you'd expect, they go down during the crisis and then recover. And when we ask whether CSR is a driver of that, we find that it's not. So what we're finding is that while financial characteristics do affect how much corporate social responsibility banks engage in, the banks are not being rewarded for these activities in the market. Thank you. >> [APPLAUSE] >> Thank you, Marcia.
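For readers who want the scoring mechanics spelled out, here is a minimal sketch of the strength/concern tally described above: each of roughly 50 characteristics is marked as a strength (1 or 0) and as a concern (1 or 0), and the ones are summed into a strength score and a concerns score. The category names, values, and the net measure below are illustrative assumptions, not the rating vendor's actual data.

```python
# Illustrative sketch of the CSR scoring described above. Each characteristic
# is a binary strength indicator and a binary concern indicator; summing the
# ones gives the firm's strength and concern scores. Example data is made up.
from typing import Dict

def csr_scores(strengths: Dict[str, int], concerns: Dict[str, int]) -> Dict[str, int]:
    """Sum binary indicators into strength, concern, and (assumed) net scores."""
    strength_score = sum(strengths.values())
    concern_score = sum(concerns.values())
    return {
        "strength": strength_score,
        "concern": concern_score,
        "net": strength_score - concern_score,  # a common summary measure, not from the talk
    }

example_bank = csr_scores(
    strengths={"community_development": 1, "human_rights": 0, "corporate_governance": 1},
    concerns={"community_development": 0, "human_rights": 1, "corporate_governance": 0},
)
print(example_bank)  # {'strength': 2, 'concern': 1, 'net': 1}
```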
If you're following our program: unfortunately, the next speaker, Pierre Berthon from our Marketing Department, had to pull out today. He is not feeling well and sends his apologies. But he's doing some very interesting work on what he calls green digits, toward an ecology of IT thinking, and if you're interested in this, I encourage you to talk with him when he's on campus. He raises some very interesting perspectives on what he calls information views, new ways of thinking about technology, and they were very much in line with what Jeroen was talking about in our opening keynote, really differentiating between traditionally instrumental views of technology and more emergent views of what technology could be. So I encourage you to track Pierre down and talk to him. Okay, our next speaker is Susan Adams, one of my colleagues in the Management Department and in our Center for Women in Business, talking about responsible innovations for more inclusive work environments. Susan. >> Okay, [INAUDIBLE] set it up beautifully for me this morning by saying that you can juggle two different types of values to create a more responsible organization or design. I have a not-so-subtle social agenda in all the work I do, juggling two different values. One is developing effective organizations that contribute to society in positive ways, and the other is organizations that provide rewarding careers for the people in them. Those two things are what I look at. Howard Gardner has a book out called Changing Minds, one of my favorite books of all time, in which he looks at how research can be a tool for social change and for changing people's ideas. So what I do in my work, and what we're doing at the Center for Women in Business, is action research that actually helps companies change what they're doing and helps people do what they can for their own careers. In the first census of women on corporate boards, with Pat Flynn and Toni Wolfman here at Bentley, we actually saw this working when one company said, okay, we're going to expand our board and put a woman on it, because they read our research report and were shamed into it. [LAUGH] But shame works, that's okay. It's guilt. I'm going to run out of time. The work at the Center for Women in Business provides a platform for helping companies, and the study I have on the poster over here is about how people can tell companies what they need to do to help advance women's careers. We actually asked all of those involved and had them write the report; it wasn't our interpretation, it was theirs. It was an innovative study using an online discussion. >> [APPLAUSE] >> Thank you, Susan. Our next speaker is Anne Schnader from our Accountancy Department, where she'll be talking about auditor reporting in the broker-dealer industry. >> Oops, does this work or doesn't it? [SOUND] >> Just hit the buttons, the down buttons. >> Okay, I'll hit the buttons. I'm very excited to tell you about all of this. Broker-dealers are a central force in our capital markets, right? But we can't rely on our traditional oversight mechanisms for them, because most of them are privately held companies, so they're not subject to the same oversight we're used to holding firms accountable with. They also have no fiduciary duty to their clients. I'll bet that's a big surprise to you all, right? You've got a broker and you figured they were looking out for your interests; they do not have that legal responsibility. So it's crucial that we have industry-specific oversight and regulation, and there's been a lot of advancement in that area in the last decade; even just in the last month, new regulations have come out. This is important to everyone. I'll bet every single person in this room has a stock, a bond, a mutual fund somewhere, and you're assuming that transactions are only happening if you've authorized them, that they're happening the way you expect, and that your assets are safe. And hopefully, you did not have an account at MF Global, okay?
So, internal controls are at the cornerstone of some of the scandals that have been going on out there. The PCAOB has done some inspections and found, in more than half of them, that auditors are really not considering the possibility of management override like they're supposed to, and that was the crux of the problem at MF Global. There's not a lot of research out there, so we've taken the time to look at and identify the internal control problems that auditors are in fact citing, to get an understanding of what's actually happening. We also looked at it by auditor class, because almost 80% of all audits in the broker-dealer industry are done by smaller firms that don't have a global network, or the expertise that comes with it, behind them, and we find some really interesting results. >> [APPLAUSE] >> Thanks, Anne. Our next speaker, Marc Resnick from our IDCC Department, will be talking about ethical ICT to restore the privacy equilibrium. >> Okay. As we heard very eloquently in our lunchtime keynote from Jeroen, there is a tension between security and privacy when we are thinking in terms of Innovation 2.0; I just stole that term from him in real time. So what I'm presenting in my poster today is how we can cut the rope and move to what he called Innovation 3.0, which is to have both. And there's really no reason we can't have both if we use innovative implementation, not just of technology, but also of analytics, which is more of an algorithmic kind of solution, as well as smart policy. Recent history in the last few weeks suggests that smart policy might be the hardest of these three to get to. He gave you lots of examples, and he had a lot more time than I do, so I'm just going to say: think about all the great examples he showed you. If you want to learn more, my poster shows how to actually do it; you can come take a look, and it's also available on SlideShare and LinkedIn if you'd like to download your own copy. Thank you. >> [APPLAUSE] >> Thanks, Marc. Our final speaker for this segment on policy and practice is David Szymanski from our Natural and Applied Sciences Department, and his focus is impacting energy policy through student engagement. >> Good afternoon. The research I'm going to present today straddles disciplinary boundaries between scholarly research on pedagogy and my work on both service and policy. I want to start with why we should focus students on energy policy, and why it matters. A recent survey polled about 1,000 adults, asking them to name various sources of energy. When asked to name a fossil fuel, 7% named one incorrectly, and astoundingly, 32% couldn't name one at all. For a renewable resource, 21% got it wrong and another 30% couldn't name one accurately. This is staggering. In terms of sustainability, energy is, in my opinion, the linchpin, and we can't do good energy policy unless we have energy literacy. That's part of why I teach a course called Science & Environmental Policy. The students in this seminar look at the scientific issue behind a piece of environmental policy, and then we try to figure out how the science makes it in, or, in most cases, doesn't.
A subset of these students then work on a fourth credit associated with the course by doing original research on energy-related policy for the Environmental and Energy Study Institute, a nonprofit in Washington, DC. The students actually design the research, then go out and conduct surveys and interviews. After they've compiled a report for our nonprofit partner, they put together a one-pager for policymakers, and we go to Washington, DC at the end of the semester. In DC, they present their one-pager and their policy recommendations to policymakers. For example, this year we met with Gina McCarthy, the new Administrator of the Environmental Protection Agency, as well as with staffers and, in some cases, the senators and congressmen themselves on Capitol Hill. The effect of this is that we actually encourage policy action. Come to my poster and I'll show you some of the other results. >> [LAUGH] [APPLAUSE] >> See, Bill, it takes someone of your stature to get faculty to actually listen to you. One of the challenges here is bringing together nine seemingly very different presentations. But reflecting on Jeroen's comments, I was really pleased when he started talking about responsible innovation and referred to the UN Millennium Development Goals that are set to expire in 2015. Just a little over three weeks ago, in New York City at the Global Leaders Summit, the UN put together a new architecture, with a particular focus on the business community, for the post-2015 development goals. I'm also pleased, and I'll use this as a little bit of advertising, that in just over two weeks Georg Kell, Executive Director of the UN Global Compact, will be our Raytheon speaker. He had to move it, I believe; he was supposed to be here in the latter part of October, but he was called to speak before the UN General Assembly on progress on the Global Compact, so we had to delay it a week. He will be here on November 5th, and I encourage you to come. Because when Jeroen referred to the golden triangle bringing together ethics and values, information technology, and accounting and finance, it really did capture this new engagement architecture that the UN is creating. Now, I'm sure you all realize, and if not I haven't done my job very effectively, that Bentley is part of the academic network of the UN Global Compact, and we were one of the initial signatories of the PRME initiative, the Principles for Responsible Management Education. I think Pat will be talking about that a little bit in the next segment. If you look at this architecture, it really focuses on all three aspects of that golden triangle. We're talking about Sustainable Development Goals and integrating them with long-term business goals, but also with transparency and accountability, which really draws in accounting and finance. So we're clearly talking about measurement practices, reporting standards, and certification schemes, and I think the presentations we heard lend themselves very much to supporting these different initiatives. And if you look at the resource base and the areas of interest the Global Compact has, they talk about an enabling environment, which includes peace and stability, infrastructure and technology, and good governance and human rights.
The resource triad, which involves food and agriculture, water and sanitation, and energy and climate. Human needs and capabilities, which encompass education, women's empowerment and gender equality, and health. And then finally, inclusive growth for a very broad range of stakeholders. I think the nine presentations you heard just now, which hopefully we'll spend time exploring in more detail, hit all of these dimensions. I'm very pleased, because one of the things the UN Global Compact has been asking schools, especially business schools, to do is not only to embrace the principles of PRME but to take the sustainable development agenda and incorporate it into what we do on campus, what we talk about in our classrooms, and the research that we do. And I was very pleased to see the systematic link drawing the research you are doing into support of these very critical goals, so thank you. Again, I encourage you now, we have another, what, half hour? To wander around, take a look at the posters, and talk to our researchers. Thank you. >> [APPLAUSE]
2013 Bentley Research Council Colloquium - 10/28/2013 - Part 1
First Annual Bentley University Research Colloquium, “Responsible Innovation,” a showcase of Bentley faculty research. Keynote by Dr. Jeroen van den Hoven, Moral Philosopher, Delft University, and Bentley Trustee. Co-sponsored by the Research Council and the Office of the Provost.