Faculty/Author Profile

Crisis Management: What to do Before, During and After


ALLEN: Dick, take it away.

RICHARD H. WALTERS: Thank you very much, Allen and Meredith, and thanks to PLI. And a word of thanks to our hale and hearty audience here who showed up on a really dreary, rainy Monday. If I were taking a vote, I would say you should get double CLE credits just for showing up. And of course, to the folks on the webcast this morning, too, hopefully in a dry, warm location.

Let me start by introducing our great panel here. We've got three terrific panelists. The first to my right-- Shelley Dropkin, who is managing director, deputy corporate secretary and general counsel for corporate governance at Citigroup. And to Shelley's right is Andy Rose, who is a partner at Joele Frank-- the large IR and PR firm. She works with public companies to develop, implement, and enhance IR and PR programs to help them achieve their business goals, and in particular has focused, among other things, including M&A, on crisis communication. So she's a terrific complement for our panel.

And to Andy's right is Steve Rosenblum from Wachtell, Lipton, Rosen and Katz, where he's been since 1989. And he's the co-chair of the firm's corporate department. He focuses on M&A, takeover defense and corporate governance, which hopefully will come in very handy for our discussions today as well.

So the topic that we're going to kick off with is one that has increasingly sort of leaped to the forefront, as Allen mentioned-- not just in boardrooms, but in C-suites, in general counsels' offices, in legal departments and in communications centers as well. And that's crisis management. If you scroll back and think about some of the large crises that have taken place in years past, from the Bhopal crisis to Exxon Valdez to the BP oil spill to some of the more recent crises, such as VW, Wells Fargo and of course, the MeToo movement, which has achieved such importance and is a driving factor.

I think many people focus on the Johnson & Johnson Tylenol situation, going back to the '80s, as kind of the baseline for talking about this subject matter. And for those of you that remember, back in the early '80s, several pharmacies in the Chicago area discovered that bottles of Tylenol had been tampered with and poisonous substances had been introduced.

And J&J was widely lauded and credited for sort of leaping to the fore, basically putting economic and commercial interests to the side in favor of public safety concerns for users and the public, and voluntarily initiating a complete recall of the product from the shelves before being told to do anything like that by the FDA. And its then-Chairman and CEO, James Burke, was, I think, widely viewed as a hero of that situation for sort of seizing control, being the public face and the public voice of that crisis.

Since J&J, I have to say, most crises have been marked by things that have gone wrong, not things that have gone right, and have served as sort of lesson-learning exercises about things that need to be looked at and probably done better. So what we've tried to do for today is to come up with a real-world type of situation to kind of tease out some of the issues that are faced in many, many crises. And this particular hypo that we're going to be using today is very relevant to an area that is increasingly the subject of crisis situations. And that is the cybersecurity area.

So we start with sort of a baseline, and we're going to be talking about a large big-box retailer called Big Deal. And Big Deal competes with all of the other large companies-- Walmart, Target and other large retailers. Big Deal seeks to differentiate itself from some of its competitors through an advanced and very sophisticated technology platform called Customer First, which facilitates data gathering about customers' preferences-- things they like and they don't like-- sends email alerts to them about sales on products, and then expedites their purchases and deliveries through customer accounts that are linked to individual credit cards. So lots of personal, what they call PII-- personally identifiable information-- is collected through the Customer First system.

During the past five years, Big Deal has experienced attempted intrusions into its Customer First system on virtually a daily basis. And this sounds like a very high incidence, but I think many who have affiliations with public companies will tell you that it's scarily not such an unusual situation to have attempted hacking going on regularly, in real time, almost all the time. In addition, there have been several successful hacks of the Customer First system-- people who have been able to get in. But they've only resulted in very minimal losses, quickly buttoned up. And any customer losses were promptly reimbursed.

So starting with that as a bit of background, let's talk about the things that can be done before a full-tilt crisis hits. And let me throw it out to the panel, maybe starting with you, Shelly. Should the company have a plan for something like this? And please.

SHELLY J. DROPKIN: Sure. So first, I just want to say that I'm really happy to be here. The views I express today are my own. They're not on behalf of Citi and they're not for attribution. So yes, I think companies should be thoughtful, particularly in the cyber area, about having plans for reporting on any kind of crisis and dealing with the crisis. It doesn't have to be a formal, written, keep-it-on-the-shelf sort of plan, but I think that all of the functions should be aware of what their roles are, who's going to communicate with whom, and who needs to know what and when, as well as what's going to be communicated publicly, to the extent something will be communicated publicly, and what needs to be included in a company's SEC filings-- both in anticipation of anything ever going wrong, and then in deciding whether something needs to be reported after something happens.

So I would hate for any company to be caught flat-footed with an incident. And with cyber being what it is, I can't imagine that there are many companies out there that don't think there is the potential for impact on them.

RICHARD H. WALTERS: So in your role of sort of counseling the board, what should the board's role be in connection with thinking about the plan that you've described?

SHELLY J. DROPKIN: So I think the board needs to be sure that it is providing proper oversight. And that it is getting the information that it needs, so that when something happens, it isn't the first time they're hearing about the issue. And of course, the amount of information you're going to provide to a board is different from the amount of information, and the timing of the information, you're going to be providing publicly.

But I think boards need to think about where the appropriate forum is for them to get certain kinds of information. Here, we're talking about cyber. So everybody kind of talks about, should it be at your audit committee? Should it be a risk committee? Should it be at the full board? Should it be, if you have one, at your operations and technology committee? Should it be everywhere? But I think the board needs to be thoughtful about: do we have people who are familiar with these issues and can understand and help guide management-- who are obviously the ones who are responsible for handling it, but who can use some guidance from the board?

And do we have a proper forum or forums where the board is regularly getting updates so that they can understand what has gone wrong, where it's gone wrong, the degree to which it's material to the company and to shareholders and to customers, and how management is prepared to address any kind of incursion by hackers. So I think the board's responsibility here is, as always, oversight. But it's not oversight only when the crisis happens; it's in advance.

RICHARD H. WALTERS: In advance. With preparation-- an ounce of preparation is worth a pound of cure. So you said that there needs to be somebody at the board level who's familiar with these issues. Does that mean there should be a digital director, or is that not necessary?

SHELLY J. DROPKIN: I think everybody has got to think about this their own way. You don't necessarily have to have a digital director, but I think if you've had a director who served as the CEO of a company that faced cyber issues, certainly that's someone who could handle it. You could have a director who has direct experience with cyber. You could have a director who understands operations and technology issues.

I think every board has to think about how significant the issues are for that company, and they have to think about what kind of board members they can get to serve who can both provide good input and oversight on these issues, but also be broad enough to serve the company. And that's not to say that board members who are not the cyber director can't be great at providing oversight on cyber.

So I think different boards have to think about the level of the risk and what kind of folks they want to have on the board who can be thoughtful about those issues.

MEREDITH: I'll jump in for one second on that. I think there's a lot of concern right now that boards are going to end up with just a collection of experts who are not a good, cohesive group of board members who can oversee a company-- that whatever the crisis of the day is, you go get a board member who's good at it. And I don't think anybody thinks that's a good governance plan.

SHELLY J. DROPKIN: I agree. I think you have to have a balance. You have to have directors who understand the risks that face a particular company, but who are multi-skilled and who can provide oversight of other issues and be collegial and do all of the other things that we want board members to do.

RICHARD H. WALTERS: Is it sufficient to have a person who is a former CEO or someone who comes from the tech sector? Can all the other board members sort of just say, OK, that's her responsibility?

SHELLY J. DROPKIN: Absolutely not.



RICHARD H. WALTERS: What is the responsibility of the other board members, even if they lack that kind of background?

SHELLY J. DROPKIN: I think the other board members have to become informed, they have to ask the right questions of management, and they can certainly seek the expertise of that board member, but it's not up to that board member to be running this. Boards, as we always say, are dependent on the information that they get from management. And so, having someone who is good at this doesn't mean that everybody else is relieved of responsibility. And it means that management has a responsibility to educate the board members on what the risks are, what the prevention looks like, what the plan looks like, and to keep them updated on those issues. This is not a once-a-year issue.

STEVEN A. ROSENBLUM: Maybe I could jump in with just a little broader context when you think about cyber risk and its oversight and management. Cyber risk is really a subset of risk generally. And risk management and risk oversight have been issues for boards for decades. And really, we've talked a little bit about the separation of the function of management and the function of the board.

As a board member-- and this goes for all of the board members-- you want to be confident that whatever risks are inherent in your company are being actively managed by management, and by outside experts to the extent necessary. For a company like Big Deal, obviously data protection is core to their business model and mission. And that's one of the things we always talk about in terms of risk oversight and risk management: you don't want to think of it as a parallel function to your business. It's part of your business-- managing risk and figuring out what the risks are for your company.

Risk management is very much a fact-specific, company-by-company exercise. And each company needs to think about what the risks are that are most important to that company and that company's operations. And for the board, I really do think it's a very useful exercise. I mean, typically, you have risk management housed in the audit committee, or some companies have risk management committees specifically set up.

But I think it's important for the full board, probably at least once a year, to have a presentation on the company's risk management processes-- overall, what the management of the company views as the biggest risks for that company and how they're managing them. And the directors need to be comfortable that it is getting the proper focus. They're not going to do the day-to-day. I don't think you need an expert in any specific area, but you do need focus and attention so that whatever risks are most important for that company are being addressed.

RICHARD H. WALTERS: So Steve, we're very impressed with your thinking. We want to hire you here at the company and get your advice on what our crisis management plan, or in particular, our cyber plan, should look like.

STEVEN A. ROSENBLUM: And the best crisis management is not allowing it to evolve into a crisis in the first place. This hypothetical, as you're going to see, is set up to show everything you shouldn't do.

RICHARD H. WALTERS: So tell us about the overall plan-- the things that you would advise us to do at the front end so that we never get to the situation, or, if we do get to the situation of a full-blown crisis, we're well prepared. So we're going to hire you to give us a little advice on how you see that we should proceed with our crisis plan in this area. What would you tell us?

STEVEN A. ROSENBLUM: Well, as I said, for this company, cyber risk is clearly one of the most significant risks to the business and the operations. And management needs to be focused on: how are they protecting the customer data? As you said, an ounce of prevention is worth a pound of cure-- spending money upfront to protect data. And there's no perfect way to protect data. I mean, companies that have put a huge amount of resources into data protection still get breached, still get hacked.

But focus on: are our systems state of the art? Is the data protection as strong as we can make it? Are we testing it? We're going to start seeing even bigger red flags, but you've already got a bit of a red flag on this very first screen, in that you know that hacks are being attempted and some of them are successful. That's something that you need to be focused on: how are they getting in, and how do we stop it before it becomes a bigger problem?

RICHARD H. WALTERS: So what's your view on--

ALLEN: Dick, I'd actually go back even a step or two earlier when big butt, when Big Deal--

RICHARD H. WALTERS: Big butt? No, Big Deal.

ALLEN: Big Deal. Yeah, I'm looking now. When Big Deal started to develop this very interactive system with customers and decided to take on for itself this huge trove of personally identifiable information as part of its business model, or the business model it was developing, somebody on the board should have asked the question whether management, in following this business model, was paying adequate attention to the cyber risks that it creates, even before we get to the point where we are now.

RICHARD H. WALTERS: So I think that we've heard one of the key elements is a full-blown risk assessment. I think, Steve, you're absolutely right. Every company is going to be different in terms of their exposure: if there is a hack, what can happen? How dependent are you, in your business model, on customer information? How vulnerable is it? What is your view, Steve, on whether there should be written policies and procedures? I think, Shelly, your view was, it doesn't necessarily have to be in writing, as long as everybody knows it. You're going to counsel us.

STEVEN A. ROSENBLUM: A written policy is going to be a two-edged sword. The risk of written policies is that you write up a set of policies that look great on paper and then you don't follow them. And then that becomes evidence of negligence if something goes wrong. So I think that if you have a system that's working, and you have the right management focus, and the board is comfortable with that, you don't need a set of written policies-- you just need the implementation of what the policies would be if they were in writing.

RICHARD H. WALTERS: So Meredith and Allen, doesn't the SEC suggest in its most recent statement that they really want written policies and procedures, and that they're looking to the board to make sure that those are in place?

ALLEN: I mean, yes, that's true. The cynical view of that is they want companies to fall into the trap that Steve is talking about-- they want companies to set up an unrealistic paradigm that they can then complain the company didn't follow. The less cynical view is, it gives regulators and overseers, and frankly, it gives directors, comfort that there's a piece of paper somewhere that tells everybody what they're supposed to do. And it's easier to demonstrate that you went through the effort of telling everybody what they were supposed to do if you have the written policies. But I agree with Steve, it is very much a two-edged sword.

RICHARD H. WALTERS: But aren't policies and procedures a mainstay of corporate America? Everybody has policies on everything. So how much greater is the risk here than it would be in any number of other areas where you've got to have a policy?

MEREDITH: I would focus also on-- so the guidance talks about disclosure controls and procedures and the kinds of other matters that the SEC really has more direct jurisdiction over, as opposed to an entire set of policies and procedures around cyber. So I think that, yeah, you need disclosure controls and procedures on cyber.


MEREDITH: Yeah, and trading windows and all those things-- things to close the trading window if need be. I just think they are reminding people that your disclosure controls and procedures and your procedures around insider trading and all that need to cover cyber too. There is a risk with cyber that it gets kind of lost with the cyber experts somewhere in the company and doesn't bubble up.

And so, that's how I read that. I don't really think that this guidance is different-- it was sort of additive to what was written in 2011 when--

RICHARD H. WALTERS: We're going to get back to that, because I think the author is someone we all know. 'Cause he's sitting two seats to the left.


SHELLY J. DROPKIN: I agree. I would just add to the concept that you were talking about earlier, Steve-- that cyber is one aspect of risk management. Presumably, these companies all have policies around risk management that also address cyber risk. So I think, when you're looking at where the policies have to be, you don't necessarily need a step-by-step "this is what we do in a crisis." But you do need policies and procedures around the oversight of particular risks that are faced by a company, as well as policies and procedures that cover your disclosure controls.

So that was how I read it, but I didn't see anything that said you needed to have a crisis management policy.

RICHARD H. WALTERS: So Andy, let's get you and your expertise involved, because we haven't had any discussion about communications-- a pretty important thing to be thinking about, right? If something goes wrong, how do you think about your internal chains of communication? How do you think about external? So if we bring you in, as well as Steve, at the ground level, before something bad happens-- when we're going to be bringing you in anyway-- what's your advice to us about how we should be thinking in advance about an effective communication plan?

ANDREA ROSE: Sure. And, by the way, I think it's all right that we haven't yet covered communications, since we are in a group of lawyers here. But much of this-- crisis communications, data breaches, MeToo, a few of the things you mentioned in your intro-- ultimately comes down to how a company communicates around an issue. So in a crisis, any corporation is going to really be judged by two criteria. The first is its role in enabling the crisis, or in this case, failing to prevent the data breach. And the second is how it communicates around the crisis.

So if something's happened, only one of those things is in your control. So we can talk a little bit about that. But just to take a step back and reference policies-- whether companies should have them written is not really our job to decide. But one thing I would say is that, while the board should always have regular updates on data breaches, something that we've been doing recently with more clients is a simulation-- and Wachtell does this too-- where you actually make the board and management sit in a room together and live through something like this to figure out how they would do it, because there's--

SHELLY J. DROPKIN: Understanding--

ANDREA ROSE: Exactly-- understanding best practices in theory, but then, when everyone's in a room, all of a sudden it's, actually, well, maybe it shouldn't be handled by a committee; we all want to be involved. So some of that can be sort of worked out. So I would say, early on, advising a company, it's getting the board and management to sit together and figure out how they would do it in real life and put some of those practices into reality.

RICHARD H. WALTERS: Great. A tabletop exercise is certainly on everybody's checklist of good board oversight and a good pragmatic approach to the management of the risk. So you said people are going to be judged by whether or not they enabled the crisis. Let's twist our situation and add a few additional facts. And the additional facts are that in 2015, after all these repeated hacking attempts on the Customer First system, Big Deal's audit department was asked to do an audit of the company's systems.

So the audit revealed a number of different deficiencies, including the company's failure to conduct regular penetration tests to see if people could get in, and its use of outdated and, in some instances, no-longer-vendor-supported software that was particularly vulnerable to being hacked. So internal audit basically recommended that Big Deal implement an accelerated program to conduct regular penetration tests and also to replace the outdated and retired software within six months' time.

The program had a projected cost of $20 million. The audit report was presented to the board's audit committee. The recommendations were duly noted and approved. The audit committee didn't take any further steps to address the remediation or to bring the recommendations to the full board. So on its own, company management deferred the audit recommendations several times in view of the cost of the project and the company's need to spend a lot of money on aggressive overseas expansion.

And fast-forward to 2018: as of the date of the filing of the company's most recent 10-K in February 2018, management still had not implemented the recommendations and the outdated software hadn't been replaced. So Shelly, back to you. In terms of your role of looking at the governance function of boards, what's your take on the audit committee's handling of this, if you will? And is this something that the audit committee should have reported to the full board, or how should this have been managed?

SHELLY J. DROPKIN: So I broke out in hives when I first read this.

RICHARD H. WALTERS: I'm sorry for that.

SHELLY J. DROPKIN: That's all right. I'm good now. So a couple of thoughts here. And companies and boards handle things in different ways. A practice that I would think is pretty universal is that committees do full reports to their boards, whether they do them in writing or verbally or both. And boards are also generally provided with, as far as I know from my colleagues, minutes of the meetings of the committees. And the idea there is that you have committees so that you have folks who can focus on certain issues on behalf of the full board. But then they report back so that the full board is informed about what is going on.

Here, you have a presentation made to the audit committee acting on behalf of the board. They noted the recommendations and they approved them. And when you say here-- or whoever wrote this says-- the audit committee did not take any further steps, I always get nervous, because what's supposed to happen there is that management is supposed to take the further steps. The audit committee, of course, is supposed to get reports back from management that they've taken the steps. And where the audit committee failed here, most likely, is that they did not properly oversee the remediation by management of this issue, nor did they update the board on what was potentially a significant issue for the company.

So I know Steve's going to come back and talk about what's going to happen to them afterward. But here, to me, the biggest fail is that once the-- excuse me, once the audit committee approved this, there should have been a "I want a report at the next meeting on the steps you've taken and how we're doing on this." And even beyond that, I think there should be a report to the full board on the decision we've made and what you're doing. So that's kind of what I see as the fail here.

RICHARD H. WALTERS: So Steve, set it up for us so that this thing doesn't fall into the dark hole. Give us some advice on what the audit committee should have done and what should have happened.

STEVEN A. ROSENBLUM: Well, I mean, so many things are wrong on this page. Basically, everyone is living in a cave and hasn't been reading the newspaper about the huge data breach incidents and the repercussions of that. The notion that you have all of this sensitive customer data, including financial data, I mean, you look at the situations where people have paid large amounts of money in the aftermath of a data breach, it's where credit card information is being stolen and sold and running up bills.

So, I think Target ended up with a few hundred million in costs. So the notion that they're going to fail to spend the $20 million, because they have overseas expansion, as opposed to fixing what they know is a flaw in their system-- protecting the customer data-- that's a management failure. The idea that the audit committee is going to listen to this and nod their heads and say, that's all very nice, and then eat the sandwiches and go home, is an audit committee failure. The fact that this audit is being done in 2015, and by 2018 the full board still hasn't heard about it, is a failure.

The only thing they did right here is they conducted the audit in the first place. But then, they got the findings and completely ignored them.

RICHARD H. WALTERS: So just from a former in-house point of view-- and again, from a German company; Germans are sort of driven by process-- the thought of an audit point not being tracked, or ranked by significance or by number, and not having dates for follow-up, and not having all of these things done so that there is reporting back when the audit point is closed, is mind-boggling to me and very, very risky. If you've created the records for yourself that you've got a problem-- if your own internal audit group has done that-- and then nobody's looking at it, tracking it and making sure that it's closed with action plans, you're asking for trouble.

ALLEN: There's a very specific fix here that I think most companies engage in, which is, when the internal-- I assume this audit is internal audit.


ALLEN: --this audit. Certainly, on the audit committee I'm on, the head internal auditor and his deputy come to every quarterly audit committee meeting. We have an executive session with them at the end of each meeting. And the material they provide, of course, has a summary of the results of all the audits they've conducted in the prior quarter. But it also has a couple of pages on: here is the status of remediation of the issues that were identified in all the prior audits, until they're done and fall off the page.

And so, the notion that the audit committee didn't every quarter see-- oh, we spent $0.12 on this, but we couldn't spend any more on it-- is simply beyond the pale. The only other point I'd make is, we don't know how big Big Deal is. And I don't know if $20 million is material to Big Deal or not, but I also don't care. Because, when the audit shows risks to this aspect of the business, I don't care if it was $2 million to remediate it-- although that makes what they did even worse.

RICHARD H. WALTERS: You should have said you don't care if it's $100 million.

ALLEN: Also don't care if it's $100 million. I mean, Wells Fargo is the object lesson for-- the amount of the fine doesn't make any difference.

RICHARD H. WALTERS: Andy, is this one of those situations where you're sort of creating an enabler that is a difficult thing to deal with from a communications point of view?

ANDREA ROSE: Yeah. Yeah, I mean, look, there are many problems with this, as we've discussed. So is it a failure of the board, a failure of management, or frankly, both? So beyond just litigation and other things that the company would be subject to once this is disclosed, fundamentally, there will be a crisis of trust and confidence in the company and/or in the board. So there could be a question, after this is all announced, of whether this was a management failure. Is the board going to then make a management change? Or even worse, is it going to be deemed sort of a board issue? And you're going to have ISS recommend a vote against, which we've seen in other data breaches. And there will be a crisis of confidence and trust among the shareholders, who are going to think: what else is being so poorly managed? And this is a dysfunctional board and there should be change, or I'm going to sell my shares.

RICHARD H. WALTERS: So how are you thinking about this? Knowing this now, before there's any blowup, what's going through your head on how you might manage this in the event that something terribly bad happens?

ANDREA ROSE: Yeah. Look, I mean, I think the reality is that it might be too late. If there is no major breach, could they fix things and no one would ever know? Maybe. But the point is, if there is a big breach and this all comes out, it very much is too late. So if we were working with the company, we would, first of all, try to figure out what exactly has happened. But then: who is in charge at the board level of managing this going forward? What steps is the company going to take based on the findings of the review? How can they strengthen their systems now? Who can they hire as third-party experts? And go through the process of figuring out if they can make some of this safe before things get disclosed.

STEVEN A. ROSENBLUM: And maybe you think about management succession, if this all comes out.


RICHARD H. WALTERS: So back to our situation. So that audit was done in 2015. And by the 2017 10-K, they still haven't done anything; they haven't acted on these things. So let's look at the company's disclosure in the 2017 10-K. And here it is. I'm not going to go through all of it. I'm just going to focus on sort of towards the-- the first part is informational-- we process, we store large amounts of data.

But then, going about halfway down, the disclosure says: we and some of our vendors have experienced past attempts to access our data-- true. But there have not been any material intrusions to date. OK. We maintain rigorous oversight and testing of our systems and processes to reduce the impact of a security breach. Although our processes and systems are designed to protect customer information and prevent data loss and other security breaches, and are regularly updated, such measures cannot provide absolute security.

So a junior associate sort of hands this off to you for a review, Steve, let's say, before the K is actually filed. What's your take on this and what do you do to address this disclosure?

STEVEN A. ROSENBLUM: Well, I mean, this is exactly the kind of generic boilerplate that the SEC's release-- and really the 2011 guidance before that, and the guidance generally on disclosure-- says is not what they want to see. We know that the company is aware of vulnerabilities in its system, which they don't say anything about here. We know that the company has had intrusions. And they do say they've experienced past attempts to access their data. They don't really say that those attempts have been successful or that they've had to pay off customers to address them.

The company obviously is in a difficult situation, because honest disclosure is going to subject them to a huge amount of criticism, because they haven't been doing substantively the things they should have been doing to address this problem. But the disclosure, I guess, compounds the potential risk and liability when and if there is a problem when it comes out, because the disclosure really doesn't disclose known issues and known risks and known events.

MEREDITH: But it strikes me, it's actually affirmatively misleading, because they don't maintain rigorous oversight and testing, and their systems are not designed to protect customer information. And that's part of the concern of the guidance--

RICHARD H. WALTERS: It's in the audit.

MEREDITH: Yeah, they did an audit and then they did not do what it said to do. If it were me, I would change this to say, we are, hopefully, by the time it's drafted, in the process of upgrading our systems so that we have rigorous something, something. So that it would at least not be false as written.

RICHARD H. WALTERS: How about the regular update part?

MEREDITH: That's not true, so they can't say that.

STEVEN A. ROSENBLUM: But I think Meredith's point is a good one, which is, as she says, hopefully, by the time you do draft the corrective disclosure, you can also say what you're doing about it. And I think-- hopefully, a company is addressing these issues without having to have disclosure as a prompt. But disclosure can be a prompt. And I think that's behind some of the SEC's guidance: they want companies to have some substantive controls and procedures. And since their jurisdiction is disclosure, they get at it through disclosure.

If you have to disclose what you're doing and what your risks are, you're not going to like the disclosure if you're not addressing the risks, and therefore, you may be more inclined to address them.

MEREDITH: Indeed, if you look at the timing on the 2011 guidance, there was a lot of legislation getting introduced to require very specific cybersecurity activities by companies. And there was also pending legislation about disclosure. And the SEC's guidance at that time and the staff's guidance was designed to have people explain it.

ALLEN: I think the 2011 guidance directly addresses--

MEREDITH: Yes, it does.

ALLEN: --precedent in this.

RICHARD H. WALTERS: Yeah, you don't need to go beyond that.

ALLEN: Exactly. Can I ask the panel a question about this disclosure? If you read this in 2015 or in 2017, would you advise the company-- or, Shelly, would you be uncomfortable-- if the company wasn't including something that talks about the fact that the customer first business model results in Big Deal having more customer identifiable information in its system than its competitors, and raises cyber risks that aren't necessarily present at its competitors? You can gussy that up and sugarcoat it any way you want, but should something like that be in Big Deal's disclosure?

SHELLY J. DROPKIN: It feels like, if that, number one, is part of their strategy, presumably, they've talked about that. So it shouldn't be that that's a big secret, but they should be addressing the risks associated with that strategy. And I thought where you were going was, here, we're talking about the 2017 10-K-- what about '16 and '15? I mean, even if we fix this now, we've got two years' worth of misleading disclosure. But yeah, I mean, it feels to me like what the SEC has said is, what's material to you?

And if your business model is one that's going to rely on the collection of this information, what's material is your ability to protect that information. You are at higher risk of a breach, and of a breach being meaningful to your customers and therefore meaningful to your shareholders. So the whole materiality of what they're holding onto and how they're protecting it, it seems to me, is more significant than for their peers. So it feels like there should be something more.

STEVEN A. ROSENBLUM: Yeah, I think that's absolutely right. I mean, I don't know that you need to highlight that it's the comparison to competitors. And I do think, actually, a number of retailers have a lot of--

ALLEN: They all have something, they all have a lot.

STEVEN A. ROSENBLUM: But the fact that you're storing this amount of sensitive data is a key reason why the whole question of data protection is so material to this company in this business, and you've got to be addressing it.

MEREDITH: The way I look at these disclosures is-- it's kind of like, first, do no harm. So you first make sure it's not affirmatively misleading. That would be a good first step with this one, because they would, I think, face charges from the SEC on this one if the facts came out. And then, I think it's probably a bridge too far to suggest that they have to say that they have bigger risks than their competitors. But they can certainly note that they have significant risks, which they sort of do, because they do say they have a lot of data.

So I would hope that we could use this as a teachable moment for this company to have to use the disclosure as a prompt to start fixing it.

RICHARD H. WALTERS: So Meredith, the SEC says that there also needs to be consideration, in addition to risk factor disclosure, of MD&A disclosure. So anything here that would warrant MD&A disclosure in your view?

MEREDITH: Well, there could be. I mean, they could have a need to talk about how the known trend or uncertainty would be the costs associated with an upgrade, if it's material. Companies have not been doing that much. I don't know if they will.

MEREDITH: The new guidance came out-- that was in the old guidance, but the new guidance came out too late for this year's 10-Ks. And so, perhaps the upcoming filings will talk about things in the MD&A section on costs. And then, also in theory, you could see a known trend or uncertainty being the risk of customers losing confidence in the company. But I don't expect the company would add that to MD&A; that seems highly unlikely.

I think the closest that you get to companies thinking about adding something in MD&A is the cost associated with an upgrade. But I don't know if anybody else has seen anything different. I don't see people adding to MD&A, we may lose our customers because we're not protecting their data.

RICHARD H. WALTERS: So the SEC, from the chairman, the very top, has been saying that the level and the quality of disclosure is inadequate. And not really saying very much until this latest release as to exactly how or why. I think, Steve, you've put your finger on it-- it's very generic here. It's not particularly company specific. Have any of you seen uplifts in the disclosure? Is it getting better? Do you expect that this season people will be focusing more carefully on their risk factor disclosure, or is it going to be the case that you can pick up a lot of different companies, mask the names, and it all sort of sounds the same?

MEREDITH: I mean, first of all, it was out in time for companies to at least put something in their proxy statements about how cyber is overseen. So that got added to a lot of proxy statements. You can see where on the board that's happening. Not a lot more detail, but some.

And then, the companies that I work with have re-looked at their risk factors for the Q-- I mean, really, that's your first opportunity. Because you didn't have time with the 10-Ks; they were already buttoned up by the time the guidance came out. So the 10-Qs have been an opportunity to do some soul searching about whether what you have is good enough. And even if it's just minor tweaks, they can be quite meaningful. So yes, I'm seeing that.

RICHARD H. WALTERS: So the new guidance isn't really any different.

MEREDITH: No, it's the same.

RICHARD H. WALTERS: So wouldn't that in part-- I mean, or is it just the drumbeat, the repetition, that's causing people to say, they really mean this, even though they're not really raising the bar?

MEREDITH: I think it's the combination of it being commission-level guidance and the SEC making it clear they want to sue people for this. They want to bring enforcement cases for bad disclosure in this area. I think both of those are causing people to focus a bit more.

ALLEN: I think you'll see more. There are two sets of companies. There are companies that include their risk factors in the Q-- everybody puts them in the 10-K, and some repeat them in the Q. There are others that don't. For the ones that don't, where all they put in is changes in their risks, there's tremendous pressure not to do that on a quarter-by-quarter basis. And so, unless there's something really defective about what's in the K, I think you'll have to wait until the next K to see something from most of those companies.

The other question is, is the SEC going to follow up the commission guidance with a comment letter campaign, or a Dear CFO letter? Because that can change the atmosphere around disclosure issues, I think, more potently than almost anything else the staff can do.

MEREDITH: And they've said they're not. They've said that it's just going to be part of the review program, but that it will be an area of focus. That's what they've said so far. Whether that then results in them being disappointed with what they see, and so then they do a Dear CFO letter or something. That may change things.

And part of this is, again, the guidance is just not that different.

RICHARD H. WALTERS: We'd better move along briskly, because we haven't really hit a crisis yet. So let's let the bomb drop here, I guess, a little bit. Right after the 10-K is filed and before the end of the company's first quarter, the company's information security team detected some unusual activity that signaled potential attempts to hack the customer first system. The team immediately undertook a number of diagnostic steps to determine whether the system's integrity had been breached. A week of testing did not reveal any actual incursion. So you've just got potential attempts at this point.

So at the end of the quarter, the company files its Q, and it's published on April 30th. One week after the Q is released and published-- filed-- the information security team determines that there had been an actual intrusion. But the extent of the intrusion and whether customer information had been obtained remained unknown.

General counsel was briefed and decided to bring in an external forensic firm to help determine the scope and magnitude of the intrusion. After an initial assessment, that firm estimated that there had been limited access to customer personally identifiable information. Less than 50,000 customers were likely affected. Nevertheless, the general counsel decides that it's best to notify the FBI of the intrusion. After consultation with the company's CEO, general counsel recommends that the CEO schedule a report to the board on the matter at the next board meeting in mid-June.

So Shelly, does that sound like a reasonable plan for communicating this up to the board level?

SHELLY J. DROPKIN: It doesn't feel that way to me. Sorry. I think if you've gotten to the point where, first of all, your initial assessment was wrong, you get an external firm in, they do find there's been access, and you're referring it to the FBI, your board needs to know about that. So to me, this sort of screams-- let's have a call with the board and brief them on this, both to ensure that they are fully informed and that they can assist in providing oversight. You do not want the board blindsided with this in June, when there could be developments that evolve between March-- or I guess we're in April now-- and June.

And the fact that the FBI is looking into this and may even want to speak to board members to understand what they know about it. So to me, I don't know why they haven't talked to them yet, but OK. Yes, they should talk to them.

RICHARD H. WALTERS: So Steve, that's a good question. What's your advice? Let's assume that everybody agrees that the board needs to be notified of this, but this is a company that's had sort of daily attempted intrusions. What's the line for when management should inform the board of what's going on? Is it at periodic intervals? You can't have a board call every time somebody attempts to break in. So how do you make decisions on what the right level of communication with the board is?

STEVEN A. ROSENBLUM: Well, I mean, as we've discussed before, the board should have been getting briefings on a regular basis as to this area of risk and what the vulnerabilities were and what the company was doing about it. And if they had done all of that the way they should have, they probably would have addressed the risk and hopefully not have had this crisis in the first place. But once you know you've got a breach that's at the level of tens of thousands of customers with financial information and you're bringing in the FBI, that's a pretty clear line and that's a pretty easy line.

In terms of maybe harder lines-- I mean, I think the kind of day-to-day activity that they were seeing before, where there are one-offs here or there, would be more something that you'd aggregate into a periodic report and update. There's no bright line, but if it's a critical mass-- I think if you bring in the FBI, it's pretty clear that the board should be hearing about it in real time.

RICHARD H. WALTERS: Do you think the board should have any input into whether to call the FBI, or do you think that's a management decision that should be properly delegated to management?

STEVEN A. ROSENBLUM: I think it's fine for that to be a management decision. Again, I think if you're doing these periodic updates, you might have a board member that has a view as to, certainly, when you bring in an outside forensic firm, this company should have had that years ago. But the board, if it's an ongoing issue or a problem, and particularly if it seems to be coming from foreign sources or something that the FBI can help with, a board member may decide in the context of one of the periodic reports that the FBI ought to be brought in. But I think in a situation like this, it's fine for the management to say, we need to address this immediately. We're going to get the FBI in, we're going to bring the board up to speed pretty much concurrently.

SHELLY J. DROPKIN: And I would suggest calling the head of the audit committee, and if you have an independent chair, calling the independent chair and letting them know at least, so that if anything comes in or comes out, you have leadership of the board who are aware that the FBI has been called in.

ALLEN: There are incremental steps you could take here. I mean, you don't have to call the whole board. There are lots of companies who keep the head of whatever committee is responsible, plus either the independent chair or the lead director, informed. I would expect a company, when they hire a forensic firm, to let the audit committee chair and the lead director know that. But I don't think you have to let the whole board know that. I think the lead director and the chair of that-- whatever committee it is-- are your sounding board as to when it's appropriate to go up further. Not to mention self-protective if you're not [INAUDIBLE].

RICHARD H. WALTERS: So we have to also face a pretty important decision about disclosure. Are we at the point where we have to go public with this? Andy, what's your--

ANDREA ROSE: Yeah, I mean, I would say it's interesting that the conversation up until now was about keeping the board in the loop. But two things are happening here. One is, there is customer data that's been breached. We think it's 50,000, but we're not quite sure. The easiest way for retailers to do a mass notification is to issue a press release. And there are many situations where there isn't always a legal requirement, depending on the nature of the hack, to issue a press release. But there can be what we would call a credibility requirement, where we actually say, do it, own it, be the first to report it. Because being slow to disclose-- if you're disclosing that something happened three months ago, and I think we know in plenty of the big, high-profile breaches this was the case-- you knew this five months ago, six months ago and you didn't tell us-- that's a real problem.

RICHARD H. WALTERS: But what do you tell them? You don't really know.

ANDREA ROSE: At this company, I would say, somewhere between the one week and when they figure out that it's 50,000-- I think we'd want to better understand the time period there. But our advice-- I mean, every situation is different, but the rule of thumb, I think, is: as soon as you have enough information. It doesn't need to be perfect; don't try to put a box around it. So you would not say 50,000. In fact, we'd say, we're assessing. But own it, own the fact that it's happened, and show that you're taking action. So you're going to offer free credit monitoring for these customers, you're going to show that you're taking action to secure the system.

Again, if you waited two weeks and there was a lot more information-- the lawyers working on the project and the security experts will give you a sense of what the right time is. But I would say waiting three months is a disaster. Just to take a quick show of hands, how many of you in this room know who Brian Krebs is? So Brian Krebs-- Krebs on Security, you can go read it later-- is the lead security blogger on all things cyber. He notified some of the big retailers about 10 years ago of their big breaches.

So I might argue that this company has had so many problems, what you don't want is for a story to appear on Krebs's blog about this when you haven't yet disclosed it.

MEREDITH: I mean, I don't view this one so much as securities law issues or governance issues, it's much more-- customers are going to need to be told. This is one of the breaches that hits customers, and so then you get stuck with, you can't tell the customers without telling the world, because the customers don't have any duty to keep it confidential. And so, it's likely not material and for long periods there was sort of cyber fatigue, nobody cared if a retailer said we've had a breach. I'm not sure how people feel right now with some of the bigger breaches that there have been. But this is one where, if you're going to have to tell customers, then you're going to end up having to tell everybody.

ANDREA ROSE: And it's become so normal, I would say. A lot of times, we tell clients, it's either you've been breached and you know it, or you've been breached and you don't yet know it. For retailers, it is commonplace. So that's why there is-- I would say the wrong way to do it is to have to revise upwards, so every week, it's like, no, it was actually 5 million more.

So either-- again, lawyers, don't kill me-- make the number sound enormous and just say, it could be this, or say, we're assessing. But don't put a box around something that you're then revising, because to the extent this is all a crisis of trust, you have to show, you know, transparency. And moving the goalposts really diminishes that credibility.

STEVEN A. ROSENBLUM: And there is a balance in terms of knowing enough that you're not just putting out a disclosure that's completely blind. But I completely agree that if you're going to wait until you know everything, then you're going to be making that disclosure way too late. So it's a judgment call. But I think you can make a first disclosure that says, it happened, here's what we know in basic outline, and we're assessing and remediating. And then maybe make a disclosure three months later, where hopefully you've got a full handle on everything that's happened. And as Andy says, don't be making weekly updates where the number is creeping up every week.

RICHARD H. WALTERS: So Meredith, going back to the disclosure that remained uncorrected for about two years in the 10-K, now, it's really in your face wrong to the extent that it wasn't before. Don't you have to take care of that too?

MEREDITH: Well, when you do your next 10-Q, assuming you--

RICHARD H. WALTERS: They just missed that.

MEREDITH: I know. If they're going to be doing some sort of offering and they've done an incorporation by reference, I'm sure they need to put in something to fix it for incorporation by reference. But you would be dealing with a disclosure problem now through a public disclosure relating to the breach. And then, when you do your next filing, you would say, for example, in such and so, as we disclosed, we were breached and we're improving our systems.

ALLEN: We've run out of time here, let's get to the--

RICHARD H. WALTERS: Yeah, the exciting, dramatic conclusion-- and I think it's been anticipated here-- is that the forensic firm continues its review for several more weeks. And surprise, surprise, the estimated number of customers affected grows from 50,000 to 500,000 or more. System vulnerabilities were also still being remediated, but had not yet been fully resolved. So now you've gotten to the point where you have to call in outside counsel.

STEVEN A. ROSENBLUM: I think you should have done that a while ago.

MEREDITH: Can I first correct-- if that disclosure before was wrong when made, then they should do a 10-Q/A. I'm sorry, I should have said that. If it was wrong when made-- I think that's a question.

RICHARD H. WALTERS: So lots and lots more issues that unfortunately, we didn't get a chance to do, such as the tricky issues of how and who do you notify in law enforcement? You've already notified the FBI. For companies that are regulated here in New York by the New York Department of Financial Services, there is a very aggressive new requirement that notifications be made and annual certifications be made by boards or board designees at senior levels. And it still remains open whether others are going to follow the lead of the New York Department of Financial Services. But my guess is that there's going to be much more likely required disclosures to different regulators when these kinds of incidents happen.

So I don't know whether any last words of advice from the panel on what needs to be done, but I'm afraid we've just kind of run out of time.

ALLEN: We have. And I want to thank the panel, this was great. And this is real world stuff at this point. So pay attention.

MEREDITH: And scary.

ALLEN: Thank you all.

MEREDITH: Thank you.

ALLEN: We're going to change on the fly here to our audit committee panel.



All Contents Copyright © 1996-2018 Practising Law Institute. Continuing Legal Education since 1933.
